Use the Hugging Face CLI from KosmoKrator to call Hugging Face tools headlessly, return JSON, inspect schemas, and automate workflows from coding agents, scripts, and CI.
Hugging Face can be configured headlessly with `kosmokrator integrations:configure hugging-face`.
# Install KosmoKrator first if it is not available on PATH.
curl -fsSL https://raw.githubusercontent.com/OpenCompanyApp/kosmokrator/main/install.sh | bash

# Configure and verify this integration.
kosmokrator integrations:configure hugging-face --set access_token="$HUGGING_FACE_ACCESS_TOKEN" --enable --read allow --write ask --json
kosmokrator integrations:doctor hugging-face --json
kosmokrator integrations:status --json
Credentials
Authentication type: Bearer token (bearer_token). Configure credentials once, then use the same stored profile from
scripts, coding CLIs, Lua code mode, and the MCP gateway.
Key            Env var                     Type    Required  Label
access_token   HUGGING_FACE_ACCESS_TOKEN   secret  yes       Access Token
url            HUGGING_FACE_URL            url     no        API Base URL
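If you also need the optional base URL, the same --set pattern from the install snippet should apply; the sketch below uses a hypothetical proxy endpoint, not a real default.

# Optional: route calls through a custom API base URL (hypothetical value).
export HUGGING_FACE_URL="https://hf-proxy.internal.example.com"
kosmokrator integrations:configure hugging-face --set url="$HUGGING_FACE_URL" --json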
Call Hugging Face Headlessly
Use the generic call form when another coding CLI or script needs a stable universal interface.
Every function below can be called headlessly. The generic form is stable across all integrations;
the provider shortcut is shorter but specific to Hugging Face.
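Neither form is spelled out on this page, so the snippet below is only a sketch: the integrations:call subcommand, the --payload flag, and the shortcut spelling are assumptions, not documented syntax. Confirm the exact invocation with the CLI help or the integrations:schema output before wiring it into automation.

# Generic form (assumed subcommand and flag names): stable across all integrations.
kosmokrator integrations:call hugging-face.huggingface_list_models \
  --payload '{"search": "llama", "limit": 5}' \
  --json

# Provider shortcut (assumed spelling): shorter, but specific to Hugging Face.
kosmokrator hugging-face huggingface_list_models \
  --payload '{"search": "llama", "limit": 5}' \
  --json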
hugging-face.huggingface_list_models
Read
Search and list AI models on the Hugging Face Hub. Filter by text search, author, task (e.g. "text-generation", "image-classification"), tags, and sort by downloads, likes, or recent activity.
hugging-face.huggingface_inference
Run inference on a Hugging Face model via the serverless Inference API. Supports text generation, summarization, translation, classification, image analysis, and more. The payload structure depends on the model's task — refer to the Hugging Face Inference API docs for model-specific formats.
Use these parameter tables when building CLI payloads without calling integrations:schema first.
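If you do want the live schema instead of the static tables, the call below mirrors the doctor/status pattern from the install snippet; treat the exact arguments as an assumption.

# Inspect the machine-readable schema for this integration (arguments assumed from the doctor/status pattern).
kosmokrator integrations:schema hugging-face --json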
hugging-face.huggingface_list_models
Search and list AI models on the Hugging Face Hub. Filter by text search, author, task (e.g. "text-generation", "image-classification"), tags, and sort by downloads, likes, or recent activity.
Parameter  Type     Required  Description
search     string   no        Search query to filter models by name or description.
author     string   no        Filter by organization or user (e.g. "HuggingFaceFW", "mozilla-foundation").
tags       array    no        Filter by tags (e.g. ["text-classification", "english"]).
sort       string   no        Sort order: "downloads", "likes", "lastModified", "created". Defaults to "downloads".
direction  string   no        Sort direction: "asc" or "desc". Defaults to "desc".
limit      integer  no        Number of results per page (default: 20, max: 500).
offset     integer  no        Offset for pagination.
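As a worked example against this table, the payload below lists popular text-generation models from one author; the integrations:call subcommand and --payload flag are the same assumptions as in the earlier sketch.

# List the ten most-downloaded text-generation models by meta-llama (subcommand and flags assumed).
kosmokrator integrations:call hugging-face.huggingface_list_models \
  --payload '{"author": "meta-llama", "tags": ["text-generation"], "sort": "downloads", "limit": 10}' \
  --json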
hugging-face.huggingface_inference
Run inference on a Hugging Face model via the serverless Inference API. Supports text generation, summarization, translation, classification, image analysis, and more. The payload structure depends on the model's task — refer to the Hugging Face Inference API docs for model-specific formats.
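No parameter table is published for this function here, so the sketch below is doubly hedged: the model, inputs, and parameters keys follow the public serverless Inference API convention for text generation, and the CLI subcommand and flags are the same assumptions as above.

# Text-generation sketch; "model", "inputs", and "parameters" keys are assumptions based on the public Inference API.
kosmokrator integrations:call hugging-face.huggingface_inference \
  --payload '{"model": "gpt2", "inputs": "Once upon a time", "parameters": {"max_new_tokens": 50}}' \
  --json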
Headless calls still follow the integration read/write permission policy. Configure read/write defaults
with integrations:configure. Add --force only for trusted automation that should bypass that policy.
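For a trusted pipeline that must not stop at the write prompt, --force can be appended to an individual call; its placement here is an assumption based on the prose above.

# Bypass the ask-before-write prompt for this single call (trusted automation only; flag placement assumed).
kosmokrator integrations:call hugging-face.huggingface_inference \
  --payload '{"model": "gpt2", "inputs": "healthcheck"}' --force --json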