ollama-mcp-bridge and cross-llm-mcp

| Metric | ollama-mcp-bridge | cross-llm-mcp |
| --- | --- | --- |
| Overall score | 71 (Verified) | 44 (Emerging) |
| Maintenance | 13/25 | 6/25 |
| Adoption | 15/25 | 5/25 |
| Maturity | 24/25 | 18/25 |
| Community | 19/25 | 15/25 |
| Stars | 67 | 13 |
| Forks | 21 | 5 |
| Downloads | 1,711 | |
| Commits (30d) | 0 | 0 |
| Language | Python | TypeScript |
| License | MIT | MIT |
| Risk flags | None | None |

About ollama-mcp-bridge

jonigl/ollama-mcp-bridge

Extends the Ollama API with dynamic AI tool integration from multiple MCP (Model Context Protocol) servers. Fully compatible, transparent, and developer-friendly; ideal for building powerful local LLM applications, AI agents, and custom chatbots.

Intercepts Ollama `/api/chat` requests through a FastAPI proxy layer, automatically aggregates tools from multiple MCP servers configured via JSON (supporting stdio, HTTP, and SSE transports), and orchestrates multi-round tool execution with configurable limits. Transparently proxies all other Ollama API endpoints while injecting discovered tools into chat requests, enabling models to seamlessly invoke capabilities from disparate MCP servers in a single conversation.
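As a sketch of what a multi-server JSON configuration along these lines might look like, the fragment below mixes a stdio-launched server with an HTTP one. It follows the common `mcpServers` convention used by MCP clients; the server names, command, and URL are illustrative assumptions, not taken from the project's documentation:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "remote-tools": {
      "url": "http://localhost:9000/mcp"
    }
  }
}
```

With a config like this, the bridge would aggregate the tools exposed by both servers and inject them into any `/api/chat` request it proxies to Ollama.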

About cross-llm-mcp

JamesANZ/cross-llm-mcp

A Model Context Protocol (MCP) server that provides access to multiple Large Language Model (LLM) APIs, including ChatGPT, Claude, Gemini, Mistral, Kimi K2, and DeepSeek.

Implements intelligent model selection through tag-based preferences (coding, reasoning, math, etc.) and persistent prompt logging with analytics, enabling cost-optimized multi-LLM workflows. Built on the MCP SDK with environment variable configuration, it integrates directly into Cursor and Claude Desktop via stdio transport, while storing preferences and prompt histories locally as JSON files for offline access and analytics.
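To illustrate the stdio-based Claude Desktop integration described above, a client-side MCP entry might look like the following sketch. The entry point path, and the exact environment variable names the server reads, are assumptions for illustration:

```json
{
  "mcpServers": {
    "cross-llm": {
      "command": "node",
      "args": ["/path/to/cross-llm-mcp/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}
```

The client launches the server as a subprocess and communicates over stdin/stdout, so no network port needs to be exposed; the API keys are passed via the subprocess environment.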

Scores updated daily from GitHub, PyPI, and npm data.