# mcp-server-ollama-bridge

MCP server bridging MCP clients to a local Ollama LLM server.
Part of the HumoticaOS / SymbAIon ecosystem.
## Installation

```bash
pip install mcp-server-ollama-bridge
```

## Configuration

Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "ollama": {
      "command": "mcp-server-ollama-bridge",
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```
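To verify that an MCP client can reach the server, one option is to launch it by hand and send an `initialize` request over stdio; MCP messages are newline-delimited JSON-RPC 2.0. The sketch below is illustrative only: the protocol version string and handshake fields are assumptions, not taken from this package.

```python
# Hedged smoke test (not part of the package): launch the server over stdio
# and send an MCP "initialize" request as newline-delimited JSON-RPC 2.0.
import json
import os
import subprocess

proc = subprocess.Popen(
    ["mcp-server-ollama-bridge"],          # assumes the console script is on PATH
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    env={**os.environ, "OLLAMA_HOST": "http://localhost:11434"},
    text=True,
)

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",   # assumed MCP protocol revision
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
    },
}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

print(proc.stdout.readline())              # the server's initialize response
proc.terminate()
```

If the server answers with a JSON-RPC `result` describing its capabilities, the stdio transport is working.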
## Docker

```bash
docker build -t mcp-server-ollama-bridge .
docker run -i -e OLLAMA_HOST=http://host.docker.internal:11434 mcp-server-ollama-bridge
```

## Environment Variables

| Variable | Default | Description |
|---|---|---|
| `OLLAMA_HOST` | `http://localhost:11434` | Ollama server URL |
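Before pointing a client at the bridge, it can be worth confirming that `OLLAMA_HOST` actually resolves to a running Ollama instance. A minimal sketch using only the Python standard library and Ollama's `/api/tags` endpoint, which lists locally installed models:

```python
# Connectivity check for the Ollama server (sketch, not part of the package).
# GET /api/tags is Ollama's endpoint for listing locally installed models.
import json
import os
import urllib.request

OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

with urllib.request.urlopen(f"{OLLAMA_HOST}/api/tags", timeout=5) as resp:
    models = json.load(resp).get("models", [])

print(f"Ollama reachable at {OLLAMA_HOST}; {len(models)} model(s) installed:")
for m in models:
    print(" -", m.get("name"))
```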
## Features

- Connect MCP clients to a local Ollama LLM server
- Support for all Ollama models
- Streaming responses (see the sketch after this list)
- Simple configuration
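As a rough illustration of the streaming feature: Ollama itself emits newline-delimited JSON chunks from its `/api/generate` endpoint, which is the kind of output the bridge relays. A sketch under those assumptions (the model name `llama3` is a placeholder for any model you have pulled; this is not the bridge's internal code):

```python
# Sketch: consuming a streaming Ollama response at the HTTP level
# (illustration only; the bridge's implementation may differ).
import json
import os
import urllib.request

OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

payload = json.dumps({
    "model": "llama3",          # placeholder: any model you have pulled
    "prompt": "Why is the sky blue?",
    "stream": True,             # Ollama then emits newline-delimited JSON chunks
}).encode()

req = urllib.request.Request(
    f"{OLLAMA_HOST}/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    for line in resp:           # each line is one JSON chunk
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):   # final chunk carries done=true
            break
print()
```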
## Authors

- Jasper van de Meent (@jaspertvdm)
- Root AI (Claude) - root_ai@humotica.nl
## License

MIT
One Love, One fAmIly!
## Official Distribution

This package is officially distributed via:

- PyPI: https://pypi.org/project/mcp-server-ollama-bridge/
- GitHub: https://github.com/jaspertvdm/mcp-server-ollama-bridge

Note: Third-party directories may list this package, but they are not official or verified distribution channels for Humotica software.