ollama-mcp-bridge and cross-llm-mcp
About ollama-mcp-bridge
jonigl/ollama-mcp-bridge
Extends the Ollama API with dynamic AI tool integration from multiple MCP (Model Context Protocol) servers. Fully compatible, transparent, and developer-friendly; ideal for building powerful local LLM applications, AI agents, and custom chatbots.
Intercepts Ollama `/api/chat` requests through a FastAPI proxy layer, automatically aggregates tools from multiple MCP servers configured via JSON (supporting stdio, HTTP, and SSE transports), and orchestrates multi-round tool execution with configurable limits. Transparently proxies all other Ollama API endpoints while injecting discovered tools into chat requests, enabling models to seamlessly invoke capabilities from disparate MCP servers in a single conversation.
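The multi-round tool-execution loop described above can be sketched as follows. This is a hypothetical illustration, not the bridge's actual code: the names `call_model`, `run_tool`, and `MAX_TOOL_ROUNDS`, and the message shapes, are assumptions for the sketch.

```python
MAX_TOOL_ROUNDS = 3  # hypothetical configurable cap on tool-execution rounds


def chat_with_tools(messages, tools, call_model, run_tool):
    """Send messages to the model; execute any requested tools and feed the
    results back, until the model answers without tool calls or the round
    limit is reached."""
    for _ in range(MAX_TOOL_ROUNDS):
        reply = call_model(messages, tools)
        tool_calls = reply.get("tool_calls", [])
        if not tool_calls:
            return reply  # final answer; no further tools requested
        messages.append(reply)
        for call in tool_calls:
            result = run_tool(call["name"], call["arguments"])
            messages.append({"role": "tool", "content": result})
    # Round limit hit: ask the model for a final answer with no tools offered.
    return call_model(messages, None)
```

The round cap keeps a model that keeps requesting tools from looping forever, which is the kind of limit the bridge's "configurable limits" refer to.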
About cross-llm-mcp
JamesANZ/cross-llm-mcp
A Model Context Protocol (MCP) server that provides access to multiple Large Language Model (LLM) APIs including ChatGPT, Claude, Gemini, Mistral, Kimi K2, and DeepSeek.
Implements intelligent model selection through tag-based preferences (coding, reasoning, math, etc.) and persistent prompt logging with analytics, enabling cost-optimized multi-LLM workflows. Built on the MCP SDK with environment variable configuration, it integrates directly into Cursor and Claude Desktop via stdio transport, while storing preferences and prompt histories locally as JSON files for offline access and analytics.
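Tag-based model selection of the kind described above can be sketched like this. The tag-to-model mapping, the `pick_model` helper, and the model names used as values are all illustrative assumptions, not cross-llm-mcp's actual configuration or API.

```python
# Hypothetical tag-to-provider preferences; not cross-llm-mcp's real config.
PREFERENCES = {
    "coding": "claude",
    "reasoning": "deepseek",
    "math": "gemini",
}
DEFAULT_MODEL = "chatgpt"  # assumed fallback when no tag matches


def pick_model(tags):
    """Return the preferred model for the first recognized tag,
    falling back to the default when no tag matches."""
    for tag in tags:
        if tag in PREFERENCES:
            return PREFERENCES[tag]
    return DEFAULT_MODEL
```

Storing a mapping like this as a local JSON file would match the server's approach of keeping preferences and prompt histories on disk for offline access.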