gemini-mcp-tool and mcp-gemini-server
These are functional competitors: both expose Gemini's capabilities as MCP tools, but gemini-mcp-tool bridges to the Gemini CLI for file analysis, while mcp-gemini-server wraps the @google/genai SDK for direct model access. Choose gemini-mcp-tool if you need CLI-based file processing, or mcp-gemini-server if you need programmatic SDK integration.
About gemini-mcp-tool
jamubc/gemini-mcp-tool
MCP server that enables AI assistants to interact with Google Gemini CLI, leveraging Gemini's massive token window for large file analysis and codebase understanding
Implements the Model Context Protocol (MCP) to bridge Claude with Gemini CLI, exposing tools like `ask-gemini` and `sandbox-test` for safe code execution and file analysis using the `@` syntax for context injection. Integrates directly into Claude Desktop and Claude Code via stdio transport, defaulting to the `gemini-2.5-pro` model while supporting configurable model selection and sandbox mode for isolated script testing.
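As a sketch of how a stdio-based MCP server like this is typically registered with Claude Desktop (the package invocation and server name below are assumptions; the project's own README is authoritative), a `claude_desktop_config.json` entry might look like:

```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "gemini-mcp-tool"]
    }
  }
}
```

With an entry like this in place, the client launches the server over stdio and can invoke its tools (such as `ask-gemini`), passing file context via the `@` syntax described above.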
About mcp-gemini-server
bsmi021/mcp-gemini-server
This project provides a dedicated MCP (Model Context Protocol) server that wraps the @google/genai SDK. It exposes Google's Gemini model capabilities as standard MCP tools, allowing other LLMs (such as Cline) or MCP-compatible systems to use Gemini as a backend workhorse.
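A hypothetical registration for an MCP client, assuming the server is run from a local build and reads a Gemini API key from the environment (the entry-point path and environment-variable name are illustrative, not taken from the project's docs):

```json
{
  "mcpServers": {
    "gemini-server": {
      "command": "node",
      "args": ["/path/to/mcp-gemini-server/dist/server.js"],
      "env": {
        "GOOGLE_GEMINI_API_KEY": "your-api-key"
      }
    }
  }
}
```

Because this server wraps the SDK directly rather than shelling out to a CLI, the API key is supplied to the server process itself, and calls go straight to the Gemini API.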