Ollama MCP Server
Freeby AI Labs
v1.0.0 · Added Jan 18, 2025
Tags: local-llm, privacy, inference

Local LLM management with model pull/push/list, multimodal support, and private inference.
Features
- Local LLMs
- Model management
- Multimodal
- Privacy-first
- Custom models
Installation
Quick install
npx -y ollama-mcp
Add to claude_desktop_config.json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
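Behind this server sits the local Ollama daemon, which by default listens on http://localhost:11434. As a minimal sketch of what "private inference" means in practice, the snippet below talks to that HTTP API directly (assumptions: Ollama is installed and running locally, and `llama3` is a placeholder for any model you have pulled, e.g. via `ollama pull llama3`):

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def build_generate_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON response instead of a token stream
    }).encode()


def generate(model: str, prompt: str) -> str:
    """One-shot completion against the local daemon (requires Ollama running)."""
    req = request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=build_generate_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because every request stays on localhost, prompts and completions never leave your machine, which is the privacy property the feature list refers to.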