gemini-mcp and claude_code-gemini-mcp
These are two implementations of the same integration pattern: both provide MCP servers that let Claude Code interact with Gemini's API, with claude_code-gemini-mcp offering a simplified alternative to the fuller-featured gemini-mcp.
About gemini-mcp
RLabs-Inc/gemini-mcp
MCP Server that enables Claude Code to interact with Gemini
Implements Gemini 3 models (Pro/Flash) and supporting services through the Model Context Protocol, exposing 20+ tools spanning generative capabilities (images, videos, code execution) and research features (web search, YouTube/document analysis, token counting). Communicates via stdio transport and integrates with Claude Code via the official MCP Registry, while also offering a standalone CLI for direct access to all functionality. Built with async task polling for video generation and context caching for efficient repeated queries on large documents.
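For context, MCP's stdio transport exchanges newline-delimited JSON-RPC 2.0 messages over stdin/stdout. A minimal sketch of how a client frames a `tools/call` request (the tool name and arguments below are hypothetical illustrations, not taken from gemini-mcp's actual schema):

```python
import json

def make_tools_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as a single
    newline-terminated line, the framing MCP uses over stdio."""
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(message) + "\n"

# Hypothetical tool name and arguments, for illustration only.
line = make_tools_call(1, "web_search", {"query": "Model Context Protocol"})
print(line, end="")
```

The server reads each line, parses the JSON, dispatches to the named tool, and writes a JSON-RPC response line back on stdout.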
About claude_code-gemini-mcp
RaiAnsar/claude_code-gemini-mcp
Simplified Gemini for Claude Code.
Implements an MCP (Model Context Protocol) server that exposes Gemini's capabilities—code review, brainstorming, and general Q&A—as Claude Code tools, bridging two AI systems through a unified interface. Uses the Google Generative AI Python SDK and installs globally to `~/.claude-mcp-servers/`, making Gemini accessible across any Claude Code workspace without per-project configuration.
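The core pattern here is a table mapping MCP tool names to handlers that forward prompts to Gemini. A minimal sketch of that dispatch shape, with the SDK call stubbed out and the tool names assumed rather than copied from the repo:

```python
from typing import Callable

def ask_gemini(prompt: str) -> str:
    # Stub standing in for a call through the Google Generative AI
    # Python SDK (e.g. a GenerativeModel generate_content call).
    return f"[gemini reply to: {prompt}]"

# Hypothetical tool table; the real tool names and schemas live in the repo.
TOOLS: dict[str, Callable[[dict], str]] = {
    "quick_query": lambda args: ask_gemini(args["question"]),
    "code_review": lambda args: ask_gemini("Review this code:\n" + args["code"]),
    "brainstorm": lambda args: ask_gemini("Brainstorm ideas on: " + args["topic"]),
}

def call_tool(name: str, arguments: dict) -> str:
    """Dispatch an incoming tools/call to the matching handler."""
    handler = TOOLS.get(name)
    if handler is None:
        raise ValueError(f"unknown tool: {name}")
    return handler(arguments)

print(call_tool("quick_query", {"question": "What is MCP?"}))
```

Because the install is global rather than per-project, a single server process like this can serve any Claude Code workspace on the machine.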