# Hugging Face Lua API for KosmoKrator Agents

Agent-facing Lua documentation and function reference for the Hugging Face KosmoKrator integration.

6 functions · 5 read · 1 write · Bearer token auth

## Lua Namespace

Agents call this integration through `app.integrations.hugging_face.*` (equivalently `app.integrations["hugging-face"]`; the hyphenated key requires bracket indexing). Use `lua_read_doc("integrations.hugging-face")` inside KosmoKrator to discover the same reference at runtime.

## Agent-Facing Lua Docs

This is the rendered version of the full Lua documentation exposed to agents when they inspect the integration namespace.

# Hugging Face — Lua API Reference

## list_models

Search and list AI models on the Hugging Face Hub.

### Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `search` | string | no | Search query to filter models |
| `author` | string | no | Filter by organization or user |
| `task` | string | no | Filter by pipeline task (e.g. `"text-generation"`, `"image-classification"`) |
| `tags` | array | no | Filter by tags (e.g. `{"pytorch", "safetensors"}`) |
| `sort` | string | no | Sort order: `"downloads"`, `"likes"`, `"lastModified"`, `"created"` |
| `direction` | string | no | Sort direction: `"asc"` or `"desc"` |
| `limit` | integer | no | Results per page (default: 20, max: 500) |
| `offset` | integer | no | Offset for pagination |

### Common Tasks

`text-generation`, `text2text-generation`, `text-classification`, `token-classification`, `question-answering`, `summarization`, `translation`, `image-classification`, `image-generation`, `automatic-speech-recognition`, `feature-extraction`, `sentence-similarity`

### Examples

```lua
-- Top text-generation models
local result = app.integrations["hugging-face"].list_models({
  task = "text-generation",
  sort = "downloads",
  limit = 10
})

for _, model in ipairs(result) do
  print(model.id .. " — " .. model.downloads .. " downloads")
end
```

```lua
-- Search for a specific model
local result = app.integrations["hugging-face"].list_models({
  search = "bert-base",
  sort = "downloads",
  limit = 5
})
```
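The `limit` and `offset` parameters support manual pagination. A hypothetical helper (not part of the integration) that walks pages of any `list_*` function until a short page signals the end:

```lua
-- Collects every result of a paged list function. `list_fn` is any of the
-- integration's list_* functions; `params` holds the fixed query fields.
local function collect_all(list_fn, params, page_size)
  page_size = page_size or 100
  local all, offset = {}, 0
  repeat
    local query = {}
    for k, v in pairs(params or {}) do query[k] = v end
    query.limit, query.offset = page_size, offset
    local page = list_fn(query)
    for _, item in ipairs(page) do all[#all + 1] = item end
    offset = offset + page_size
  until #page < page_size
  return all
end

-- Usage against the real integration:
-- local models = collect_all(app.integrations["hugging-face"].list_models,
--                            { task = "text-generation" }, 100)
```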

---

## get_model

Get detailed information about a specific model.

### Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `model_id` | string | yes | The model ID (e.g. `"meta-llama/Llama-3.3-70B-Instruct"`) |

### Example

```lua
local model = app.integrations["hugging-face"].get_model({
  model_id = "meta-llama/Llama-3.3-70B-Instruct"
})

print(model.id)
print(model.pipeline_tag)
print(model.downloads .. " downloads")
print(model.likes .. " likes")
```

---

## list_datasets

Search and list datasets on the Hugging Face Hub.

### Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `search` | string | no | Search query to filter datasets |
| `author` | string | no | Filter by organization or user |
| `tags` | array | no | Filter by tags |
| `sort` | string | no | Sort order: `"downloads"`, `"likes"`, `"lastModified"`, `"created"` |
| `direction` | string | no | Sort direction: `"asc"` or `"desc"` |
| `limit` | integer | no | Results per page (default: 20, max: 500) |
| `offset` | integer | no | Offset for pagination |

### Example

```lua
local datasets = app.integrations["hugging-face"].list_datasets({
  search = "sentiment",
  sort = "downloads",
  limit = 10
})

for _, ds in ipairs(datasets) do
  print(ds.id .. " — " .. ds.downloads .. " downloads")
end
```
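Results come back as plain Lua arrays, so they can be re-ranked client-side after fetching. A small hypothetical helper (fields beyond `id`, `downloads`, and `likes` vary by result type):

```lua
-- Re-rank a returned list in place by any numeric field, highest first.
-- Missing fields count as 0 so partial records sort last.
local function rank_by(items, field)
  table.sort(items, function(a, b)
    return (a[field] or 0) > (b[field] or 0)
  end)
  return items
end

-- Usage against real results, e.g. re-rank by likes instead of downloads:
-- rank_by(app.integrations["hugging-face"].list_datasets({
--   search = "sentiment"
-- }), "likes")
```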

---

## inference

Run inference on a model via the Hugging Face Inference API.

### Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `model_id` | string | yes | The model ID to run inference on |
| `inputs` | string | no | Input text for the model (for text tasks) |
| `parameters` | object | no | Model-specific parameters (e.g. `max_new_tokens`, `temperature`) |
| `data` | string | no | Base64-encoded data for image/audio tasks |

### Common Parameters for Text Generation

| Parameter | Type | Description |
|-----------|------|-------------|
| `max_new_tokens` | integer | Maximum tokens to generate |
| `temperature` | number | Sampling temperature (0.0–2.0) |
| `top_p` | number | Nucleus sampling threshold |
| `top_k` | integer | Top-k sampling |
| `repetition_penalty` | number | Penalty for repeated tokens |
| `do_sample` | boolean | Enable sampling (true) or greedy decoding (false) |
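The `data` field expects Base64, and stock Lua has no built-in encoder. A minimal pure-Lua sketch (if the KosmoKrator runtime exposes its own Base64 utility, prefer that; the model in the usage comment is only illustrative):

```lua
-- Minimal pure-Lua Base64 encoder for filling the `data` field.
local B64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

local function base64_encode(input)
  local out = {}
  for i = 1, #input, 3 do
    local a, b, c = input:byte(i, i + 2)
    local n = a * 65536 + (b or 0) * 256 + (c or 0)
    local c1 = math.floor(n / 262144) % 64
    local c2 = math.floor(n / 4096) % 64
    local c3 = math.floor(n / 64) % 64
    local c4 = n % 64
    out[#out + 1] = B64:sub(c1 + 1, c1 + 1) .. B64:sub(c2 + 1, c2 + 1)
      .. (b and B64:sub(c3 + 1, c3 + 1) or "=")  -- pad when input ends mid-triple
      .. (c and B64:sub(c4 + 1, c4 + 1) or "=")
  end
  return table.concat(out)
end

-- Usage with an image model (assumes `image_bytes` holds raw file bytes):
-- app.integrations["hugging-face"].inference({
--   model_id = "google/vit-base-patch16-224",
--   data = base64_encode(image_bytes)
-- })
```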

### Examples

```lua
-- Text generation
local result = app.integrations["hugging-face"].inference({
  model_id = "meta-llama/Llama-3.3-70B-Instruct",
  inputs = "What is the meaning of life?",
  parameters = {
    max_new_tokens = 100,
    temperature = 0.7
  }
})

for _, item in ipairs(result) do
  print(item.generated_text)
end
```

```lua
-- Summarization
local result = app.integrations["hugging-face"].inference({
  model_id = "facebook/bart-large-cnn",
  inputs = "The tower is 324 metres (1,063 ft) tall, about the same height as an 81-storey building..."
})

for _, item in ipairs(result) do
  print(item.summary_text)
end
```

```lua
-- Text classification (sentiment)
local result = app.integrations["hugging-face"].inference({
  model_id = "distilbert-base-uncased-finetuned-sst-2-english",
  inputs = "I love using Hugging Face!"
})

for _, item in ipairs(result) do
  for _, label in ipairs(item) do
    print(label.label .. ": " .. label.score)
  end
end
```
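Inference calls can fail transiently (model cold starts, rate limits), so it is worth wrapping them in `pcall`. A generic retry sketch; the helper name and retry count are illustrative, not part of the API:

```lua
-- Runs `fn` up to `retries` times, returning its result on first success,
-- or nil plus the last error if every attempt raised.
local function with_retry(fn, retries)
  local last_err
  for attempt = 1, (retries or 3) do
    local ok, result = pcall(fn)
    if ok then return result end
    last_err = result
  end
  return nil, last_err
end

-- Usage against the real integration:
-- local result, err = with_retry(function()
--   return app.integrations["hugging-face"].inference({
--     model_id = "facebook/bart-large-cnn",
--     inputs = "Text to summarize..."
--   })
-- end)
```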

---

## list_spaces

Search and list Spaces on the Hugging Face Hub.

### Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `search` | string | no | Search query to filter Spaces |
| `author` | string | no | Filter by organization or user |
| `tags` | array | no | Filter by tags |
| `sort` | string | no | Sort order: `"downloads"`, `"likes"`, `"lastModified"`, `"created"` |
| `direction` | string | no | Sort direction: `"asc"` or `"desc"` |
| `limit` | integer | no | Results per page (default: 20, max: 500) |
| `offset` | integer | no | Offset for pagination |

### Example

```lua
local spaces = app.integrations["hugging-face"].list_spaces({
  search = "chat",
  sort = "likes",
  limit = 10
})

for _, space in ipairs(spaces) do
  print(space.id .. " — " .. (space.likes or 0) .. " likes")
end
```

---

## get_current_user

Get the authenticated user's profile information.

### Parameters

None.

### Example

```lua
local user = app.integrations["hugging-face"].get_current_user({})

print("Name: " .. user.fullname)
print("Username: " .. user.name)
print("Type: " .. user.type)
```

---

## Multi-Account Usage

If you have multiple Hugging Face accounts configured, use account-specific namespaces:

```lua
-- Default account (always works)
app.integrations["hugging-face"].list_models({})

-- Explicit default (portable across setups)
app.integrations["hugging-face"].default.list_models({})

-- Named accounts
app.integrations["hugging-face"].work.list_models({})
app.integrations["hugging-face"].research.list_models({})
```

All functions are identical across accounts — only the credentials differ.
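Because accounts are plain table keys, the namespace can also be chosen at runtime. A hypothetical sketch (`pick` is not part of the integration, and the account names above are just examples):

```lua
-- Resolve an account sub-namespace by name, falling back to the
-- integration's default namespace when the name is nil or unknown.
local function pick(ns, account)
  if account and ns[account] then return ns[account] end
  return ns
end

-- Usage:
-- local hf = pick(app.integrations["hugging-face"], os.getenv("HF_ACCOUNT"))
-- hf.list_models({ limit = 5 })
```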


## Metadata-Derived Lua Example

All values below are placeholders; note that `tags` takes an array.

```lua
local result = app.integrations.hugging_face.huggingface_list_models({
  search = "example_search",
  author = "example_author",
  task = "example_task",
  tags = { "example_tags" },
  sort = "example_sort",
  direction = "example_direction",
  limit = 1,
  offset = 1
})
print(result)
```

## Functions

### huggingface_list_models

Search and list AI models on the Hugging Face Hub. Filter by text search, author, task (e.g. "text-generation", "image-classification"), tags, and sort by downloads, likes, or recent activity.

- **Operation:** read
- **Full name:** `hugging-face.huggingface_list_models`

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `search` | string | no | Search query to filter models by name or description. |
| `author` | string | no | Filter by organization or user (e.g. "meta-llama", "openai"). |
| `task` | string | no | Filter by pipeline task (e.g. "text-generation", "text-classification", "image-classification", "automatic-speech-recognition"). |
| `tags` | array | no | Filter by tags (e.g. ["pytorch", "safetensors"]). |
| `sort` | string | no | Sort order: "downloads", "likes", "lastModified", "created". Defaults to "downloads". |
| `direction` | string | no | Sort direction: "asc" or "desc". Defaults to "desc". |
| `limit` | integer | no | Number of results per page (default: 20, max: 500). |
| `offset` | integer | no | Offset for pagination. |

### huggingface_get_model

Get detailed information about a specific Hugging Face model, including its card, tags, pipeline tag, library, downloads, likes, and file listing.

- **Operation:** read
- **Full name:** `hugging-face.huggingface_get_model`

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `model_id` | string | yes | The model ID (e.g. "meta-llama/Llama-3.3-70B-Instruct", "bert-base-uncased"). |

### huggingface_list_datasets

Search and list datasets on the Hugging Face Hub. Filter by text search, author, tags, and sort by downloads, likes, or recent activity.

- **Operation:** read
- **Full name:** `hugging-face.huggingface_list_datasets`

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `search` | string | no | Search query to filter datasets by name or description. |
| `author` | string | no | Filter by organization or user (e.g. "HuggingFaceFW", "mozilla-foundation"). |
| `tags` | array | no | Filter by tags (e.g. ["text-classification", "english"]). |
| `sort` | string | no | Sort order: "downloads", "likes", "lastModified", "created". Defaults to "downloads". |
| `direction` | string | no | Sort direction: "asc" or "desc". Defaults to "desc". |
| `limit` | integer | no | Number of results per page (default: 20, max: 500). |
| `offset` | integer | no | Offset for pagination. |

### huggingface_inference

Run inference on a Hugging Face model via the serverless Inference API. Supports text generation, summarization, translation, classification, image analysis, and more. The payload structure depends on the model's task — refer to the Hugging Face Inference API docs for model-specific formats.

- **Operation:** write
- **Full name:** `hugging-face.huggingface_inference`

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `model_id` | string | yes | The model ID to run inference on (e.g. "meta-llama/Llama-3.3-70B-Instruct", "facebook/bart-large-cnn"). |
| `inputs` | string | no | The input text or data for the model. For text tasks, this is the prompt or text to process. |
| `parameters` | object | no | Model-specific parameters (e.g. {"max_new_tokens": 256, "temperature": 0.7, "top_p": 0.95} for text generation). |
| `data` | string | no | Base64-encoded data for image/audio tasks. Use this instead of "inputs" for non-text models. |

### huggingface_list_spaces

Search and list Spaces on the Hugging Face Hub. Filter by text search, author, tags, SDK, and sort by downloads, likes, or recent activity.

- **Operation:** read
- **Full name:** `hugging-face.huggingface_list_spaces`

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `search` | string | no | Search query to filter Spaces by name or description. |
| `author` | string | no | Filter by organization or user (e.g. "gradio", "stabilityai"). |
| `tags` | array | no | Filter by tags (e.g. ["gradio", "text-generation"]). |
| `sort` | string | no | Sort order: "downloads", "likes", "lastModified", "created". Defaults to "downloads". |
| `direction` | string | no | Sort direction: "asc" or "desc". Defaults to "desc". |
| `limit` | integer | no | Number of results per page (default: 20, max: 500). |
| `offset` | integer | no | Offset for pagination. |

### huggingface_get_current_user

Get the authenticated Hugging Face user's profile information, including name, username, type (user/org), and avatar.

- **Operation:** read
- **Full name:** `hugging-face.huggingface_get_current_user`

No parameters.