Hugging Face CLI for CI
6 functions (5 read, 1 write) · Bearer token auth
Run integration calls from CI jobs with JSON output, explicit credentials, and predictable exit status.
Use this shape when a pipeline needs to read or update an external service. The Hugging Face CLI uses the same integration registry as the TUI, Lua runtime, and MCP gateway, but returns predictable command output for automation.
Command Shape
# Hugging Face CLI for CI
kosmo integrations:configure hugging-face --set access_token="$HUGGING_FACE_ACCESS_TOKEN" --enable --read allow --write ask --json
kosmo integrations:call hugging-face.huggingface_list_models '{"search":"example_search","author":"example_author","task":"example_task","tags":"example_tags","sort":"example_sort","direction":"example_direction","limit":1,"offset":1}' --json

Discovery Before Execution
Agents and scripts can inspect Hugging Face docs and schemas before choosing a function.
kosmo integrations:docs hugging-face --json
kosmo integrations:docs hugging-face.huggingface_list_models --json
kosmo integrations:schema hugging-face.huggingface_list_models --json
kosmo integrations:search "Hugging Face" --json
kosmo integrations:list --json

Useful Hugging Face CLI Functions
| Function | Type | Parameters | Description |
|---|---|---|---|
| hugging-face.huggingface_list_models | Read | search, author, task, tags, sort, direction, limit, offset | Search and list AI models on the Hugging Face Hub. Filter by text search, author, task (e.g. "text-generation", "image-classification"), and tags, and sort by downloads, likes, or recent activity. |
| hugging-face.huggingface_get_model | Read | model_id | Get detailed information about a specific Hugging Face model, including its card, tags, pipeline tag, library, downloads, likes, and file listing. |
| hugging-face.huggingface_list_datasets | Read | search, author, tags, sort, direction, limit, offset | Search and list datasets on the Hugging Face Hub. Filter by text search, author, and tags, and sort by downloads, likes, or recent activity. |
| hugging-face.huggingface_inference | Write | model_id, inputs, parameters, data | Run inference on a Hugging Face model via the serverless Inference API. Supports text generation, summarization, translation, classification, image analysis, and more. The payload structure depends on the model's task — refer to the Hugging Face Inference API docs for model-specific formats. |
| hugging-face.huggingface_list_spaces | Read | search, author, tags, sort, direction, limit, offset | Search and list Spaces on the Hugging Face Hub. Filter by text search, author, tags, and SDK, and sort by downloads, likes, or recent activity. |
| hugging-face.huggingface_get_current_user | Read | none | Get the authenticated Hugging Face user's profile information, including name, username, type (user/org), and avatar. |
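As a sketch, the write-type `huggingface_inference` function above might be called from a CI step like the one below. The model id (`gpt2`), the `inputs`/`parameters` payload shape, and the `max_new_tokens` value are illustrative assumptions — the real payload format depends on the model's task, so inspect `kosmo integrations:schema hugging-face.huggingface_inference --json` before relying on it.

```shell
#!/bin/sh
# Illustrative payload for a text-generation model; the shape is an assumption,
# not a documented contract — verify against the function schema first.
payload='{"model_id":"gpt2","inputs":"Write a release note for v1.2:","parameters":{"max_new_tokens":64}}'

# Guard on the CLI being installed so the step degrades cleanly outside CI.
if command -v kosmo >/dev/null 2>&1; then
  kosmo integrations:call hugging-face.huggingface_inference "$payload" --json
else
  echo "kosmo CLI not found; skipping inference call" >&2
fi
```

Because the write policy in the configure example is `ask`, an unattended job calling this function would block on approval unless the policy is changed or trusted-automation flags are used.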
Automation Notes
- Use `--json` for machine-readable output.
- Keep credentials out of argv by using environment variables or stored KosmoKrator configuration.
- Configure read/write policy before unattended runs; use `--force` only for trusted automation.
- Use the MCP gateway instead when the agent needs dynamic tool discovery inside a conversation.
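Putting the notes above together, a minimal unattended CI step might look like the following sketch. It assumes `HUGGING_FACE_ACCESS_TOKEN` is provided as a CI secret and that the search payload shape matches the `huggingface_list_models` schema; both are assumptions to verify with a manual `--json` run first.

```shell
#!/bin/sh
# Fail the job on the first non-zero exit status or unset variable,
# so a failed configure or call stops the pipeline immediately.
set -eu

run_step() {
  # Token comes from the environment, never from hard-coded argv values.
  kosmo integrations:configure hugging-face \
    --set access_token="$HUGGING_FACE_ACCESS_TOKEN" \
    --enable --read allow --write ask --json

  # Read-only call; exits non-zero on failure, which aborts the job via set -e.
  kosmo integrations:call hugging-face.huggingface_list_models \
    '{"search":"llama","limit":5}' --json
}

if command -v kosmo >/dev/null 2>&1; then
  run_step
else
  echo "kosmo CLI not found; skipping integration step" >&2
fi
```

Keeping the calls inside a function makes it easy to reuse the same step across pipelines while the guard keeps local runs from failing on a missing binary.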