gemini-mcp-tool and gopher-mcp
These tools complement each other despite the name collision: gemini-mcp-tool bridges AI assistants to Google Gemini's CLI for local file and codebase analysis, while gopher-mcp lets those same assistants browse remote Gemini protocol resources. Used together, a single workflow can cover both senses of "Gemini" — the Google model and the smolweb protocol.
About gemini-mcp-tool
jamubc/gemini-mcp-tool
MCP server that enables AI assistants to interact with Google Gemini CLI, leveraging Gemini's massive token window for large file analysis and codebase understanding
Implements the Model Context Protocol (MCP) to bridge Claude with Gemini CLI, exposing tools like `ask-gemini` and `sandbox-test` for safe code execution and file analysis using the `@` syntax for context injection. Integrates directly into Claude Desktop and Claude Code via stdio transport, defaulting to the `gemini-2.5-pro` model while supporting configurable model selection and sandbox mode for isolated script testing.
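For reference, registering the server in Claude Desktop typically means adding an entry to `claude_desktop_config.json`. The sketch below assumes the package is published on npm as `gemini-mcp-tool` and that the server key `gemini-cli` is free to choose; adjust to match your install:

```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "gemini-mcp-tool"]
    }
  }
}
```

Because the transport is stdio, Claude Desktop launches the process itself; no port or URL configuration is involved.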
About gopher-mcp
cameronrye/gopher-mcp
A modern, cross-platform Model Context Protocol (MCP) server that enables AI assistants to browse and interact with both Gopher protocol and Gemini protocol resources safely and efficiently.
Built on the FastMCP framework with async/await patterns, it provides `gopher_fetch` and `gemini_fetch` tools that return structured JSON responses optimized for LLM consumption. Implements advanced security through TOFU certificate validation, client certificate support, timeouts, size limits, and host allowlists. Integrates directly with Claude Desktop via stdio transport and supports HTTP-based MCP servers for broader ecosystem compatibility.
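A comparable stdio entry for gopher-mcp might look like the following. This is a hedged sketch: it assumes the server is distributed on PyPI under the name `gopher-mcp` and can be launched with `uvx`; substitute whatever command your installation method provides:

```json
{
  "mcpServers": {
    "gopher": {
      "command": "uvx",
      "args": ["gopher-mcp"]
    }
  }
}
```

Once registered, the assistant can call `gopher_fetch` and `gemini_fetch` like any other MCP tools, with the server's TOFU validation, timeouts, and size limits applied to each request.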