# Google Gemini MCP Integration for Vercel AI SDK
Connect Google Gemini to Vercel AI SDK through the local KosmoKrator MCP gateway with scoped tools, credentials, and write policy.
## Connect Google Gemini to Vercel AI SDK
Use KosmoKrator as a local integration gateway for Vercel AI SDK agents and scripts.
Create an MCP client that starts or connects to the KosmoKrator gateway for the selected integration. The gateway runs locally, is scoped to this integration alone, and starts with `--write=deny`, so Vercel AI SDK can inspect read-capable tools without receiving write access by default.
## Google Gemini MCP Config for Vercel AI SDK
Prefer CLI JSON calls when a workflow only needs one deterministic integration operation.
```json
{
  "mcpServers": {
    "kosmokrator-google-gemini": {
      "type": "stdio",
      "command": "kosmo",
      "args": [
        "mcp:serve",
        "--integration=google-gemini",
        "--write=deny"
      ]
    }
  }
}
```

## Run the Gateway Manually
```shell
kosmokrator mcp:serve --integration=google-gemini --write=deny
```

## Why Use KosmoKrator Here
- Expose only Google Gemini instead of a broad multi-service tool list.
- Reuse credentials already configured for the KosmoKrator CLI and Lua runtime.
- Start read-only, then opt into `ask` or `allow` for trusted workspaces.
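To loosen the policy in a trusted workspace, swap the flag value in the same config. Only `--write=deny` appears on this page; the `ask` value below is assumed from the policy names in the list above.

```json
{
  "mcpServers": {
    "kosmokrator-google-gemini": {
      "type": "stdio",
      "command": "kosmo",
      "args": ["mcp:serve", "--integration=google-gemini", "--write=ask"]
    }
  }
}
```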
## Google Gemini Tools Visible to Vercel AI SDK
Vercel AI SDK sees stable MCP tool names generated from the Google Gemini integration catalog.
| MCP tool | Source function | Type | Description |
|---|---|---|---|
| `integration__google_gemini__gemini_list_models` | `google-gemini.gemini_list_models` | Read | List available Gemini AI models. Returns model names, display names, supported generation methods, and other metadata. |
| `integration__google_gemini__gemini_get_model` | `google-gemini.gemini_get_model` | Read | Get detailed information about a specific Gemini model, including supported generation methods, input/output token limits, and capabilities. |
| `integration__google_gemini__gemini_generate_content` | `google-gemini.gemini_generate_content` | Write | Generate content using a Gemini model. Send text prompts and receive AI-generated responses. Supports configurable generation parameters like temperature, topP, and maxOutputTokens. |
| `integration__google_gemini__gemini_list_files` | `google-gemini.gemini_list_files` | Read | List files uploaded to the Gemini File API. Returns file names, MIME types, sizes, and states. |
| `integration__google_gemini__gemini_get_file` | `google-gemini.gemini_get_file` | Read | Get metadata for an uploaded file in the Gemini File API, including its name, display name, MIME type, size, and processing state. |
| `integration__google_gemini__gemini_list_tuned_models` | `google-gemini.gemini_list_tuned_models` | Read | List tuned (fine-tuned) Gemini models in your project. Returns model names, base models, tuning tasks, and display names. |
| `integration__google_gemini__gemini_get_current_user` | `google-gemini.gemini_get_current_user` | Read | Get information about the currently authenticated Google user, including permissions and account details. |