mcp-llm and locallama-mcp
About mcp-llm
sammcj/mcp-llm
An MCP server that gives LLMs access to other LLMs
Exposes code generation, documentation, and Q&A capabilities through LlamaIndexTS integration, enabling Claude and other MCP clients to programmatically invoke LLM tools for tasks like in-file code insertion and multi-format documentation. Built as a Node.js MCP server compatible with Claude Desktop via Smithery, supporting both stdio transport and direct programmatic access through example scripts.
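For Claude Desktop, a stdio MCP server like this is registered in `claude_desktop_config.json` under `mcpServers`. The server name `llm` and the build path below are illustrative assumptions, not taken from the repo:

```json
{
  "mcpServers": {
    "llm": {
      "command": "node",
      "args": ["/path/to/mcp-llm/dist/index.js"]
    }
  }
}
```

With this entry in place, Claude Desktop launches the server as a child process and speaks MCP to it over stdin/stdout.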
About locallama-mcp
Heratiki/locallama-mcp
An MCP server that works with Roo Code, Cline.Bot, and Claude Desktop to optimize costs by intelligently routing coding tasks between local LLMs, free APIs, and paid APIs.
Implements intelligent task routing through a decision engine that analyzes code complexity, dependency mapping, and execution order to minimize token costs—combining real-time API monitoring with benchmarking data from local LLMs (LM Studio, Ollama) and OpenRouter's free/paid models. Features BM25-based semantic code search via Retriv, adaptive model selection with performance history, and preemptive routing that avoids API calls for faster decisions without sacrificing accuracy.
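The preemptive routing idea above can be sketched as a pure heuristic decision: estimate task complexity locally and pick a route without any API round-trip. This is a hypothetical illustration of the technique, not locallama-mcp's actual decision engine; the thresholds, the `Route` names, and the complexity heuristic are all assumptions:

```typescript
// Hypothetical sketch of cost-aware task routing (not locallama-mcp's real code).

type Route = "local" | "free-api" | "paid-api";

interface Task {
  code: string;          // the code the LLM is asked to work on
  contextTokens: number; // estimated prompt size in tokens
}

// Rough complexity heuristic: longer code with more branching scores higher.
function estimateComplexity(task: Task): number {
  const branches = (task.code.match(/\b(if|for|while|switch|catch)\b/g) ?? []).length;
  return task.code.length / 1000 + branches * 0.1;
}

// Preemptive routing: decide from local heuristics alone, no API call needed.
// localQuality is a 0..1 score from benchmarking the local model (assumed input).
function routeTask(task: Task, localQuality: number): Route {
  const complexity = estimateComplexity(task);
  if (complexity < 1 && localQuality > 0.7) return "local"; // cheap task, capable local model
  if (task.contextTokens < 4000) return "free-api";         // fits a free tier's limits
  return "paid-api";                                        // fall back to a paid model
}
```

In the real project this decision is additionally informed by dependency mapping and historical benchmark data; the sketch only shows the shape of a route-before-calling policy.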