gemini-mcp-tool and gemini-mcp
These projects are complementary: the first exposes the Gemini CLI's capabilities as an MCP server for any AI assistant, while the second lets Claude Code call Gemini's API directly, so users can combine both models' strengths in a single workflow.
About gemini-mcp-tool
jamubc/gemini-mcp-tool
MCP server that enables AI assistants to interact with Google Gemini CLI, leveraging Gemini's massive token window for large file analysis and codebase understanding
Implements the Model Context Protocol (MCP) to bridge Claude with Gemini CLI, exposing tools like `ask-gemini` and `sandbox-test` for safe code execution and file analysis using the `@` syntax for context injection. Integrates directly into Claude Desktop and Claude Code via stdio transport, defaulting to the `gemini-2.5-pro` model while supporting configurable model selection and sandbox mode for isolated script testing.
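Because the server communicates over stdio, wiring it up is a matter of registering the launch command with the client. A minimal sketch of a Claude Desktop `mcpServers` entry, assuming the package is published on npm under the name `gemini-mcp-tool` and can be launched with `npx` (server key name is illustrative):

```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "gemini-mcp-tool"]
    }
  }
}
```

With this in place, the client spawns the process on startup and the `ask-gemini` and `sandbox-test` tools become available to the assistant.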
About gemini-mcp
RLabs-Inc/gemini-mcp
MCP server that enables Claude Code to interact with Gemini
Implements Gemini 3 models (Pro/Flash) and supporting services through the Model Context Protocol, exposing 20+ tools spanning generative capabilities (images, videos, code execution) and research features (web search, YouTube/document analysis, token counting). Communicates via stdio transport and integrates with Claude Code through the official MCP Registry, while also offering a standalone CLI for direct access to all functionality. Built with async task polling for long-running video generation and context caching for efficient repeated queries over large documents.
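Under the hood, any MCP stdio server like this one speaks JSON-RPC 2.0 over stdin/stdout: a client discovers the available tools with `tools/list` and invokes one with `tools/call`. A sketch of the two request messages (the tool name and arguments below are illustrative, not taken from the repo):

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "web-search", "arguments": {"query": "context caching"}}}
```

The server replies with matching `id`s, returning the tool catalog for the first request and the tool's result content for the second.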