wet-mcp and google-ai-mode-mcp
These two MCP servers are complementary: wet-mcp provides core web search, content extraction, and documentation indexing, while google-ai-mode-mcp targets one specific surface, automating Google AI Mode searches with query optimization, CAPTCHA handling, and multi-agent support.
About wet-mcp
n24q02m/wet-mcp
MCP server for web search, content extraction, and documentation indexing
Provides embedded metasearch (SearXNG) with semantic reranking and query expansion, plus specialized academic research across Google Scholar, arXiv, and PubMed. Features local full-text documentation indexing with HyDE-enhanced retrieval, batch content extraction from up to 50 URLs, and multimodal analysis—all with zero-config local embeddings (Qwen3) or optional cloud providers. Integrates as an MCP server with Claude, Gemini, and Codex via stdio transport, with automatic setup and encrypted credential storage.
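Since wet-mcp integrates over stdio transport, it can be registered in a standard MCP client configuration. A minimal sketch, assuming the server is published as a `wet-mcp` package launched via `uvx` (the actual command and package name may differ; check the repository's install instructions):

```json
{
  "mcpServers": {
    "wet": {
      "command": "uvx",
      "args": ["wet-mcp"]
    }
  }
}
```

With zero-config local embeddings, no API keys are required for the default setup; optional cloud embedding providers would be supplied via environment variables in an `env` block.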
About google-ai-mode-mcp
PleasePrompto/google-ai-mode-mcp
MCP server for free Google AI Mode search with citations. Query optimization, CAPTCHA handling, multi-agent support. Works with Claude Code, Cursor, Cline, Windsurf.
Leverages Puppeteer with stealth techniques and persistent browser profiles to automate Google AI Mode searches, using a 4-stage completion-detection system (SVG thumbs-up → aria-label → text → timeout) that, per the project, returns results 87% faster than timeout-only waiting and works across multiple languages. Extracts AI-synthesized answers and citations via 17 language-agnostic selectors, then converts them to Markdown with inline references, eliminating the token cost of manual multi-page research while preserving source attribution. Exposes a single MCP tool callable from any agent (Claude, Cursor, Cline, Windsurf, Zed), with optional file persistence and automatic CAPTCHA handling via a visible-browser fallback.
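Because the server runs on Node (Puppeteer-based), a typical registration would launch it through `npx`. A minimal sketch, assuming the npm package is named `google-ai-mode-mcp` (the published package name and any flags are assumptions; consult the repository's README):

```json
{
  "mcpServers": {
    "google-ai-mode": {
      "command": "npx",
      "args": ["-y", "google-ai-mode-mcp"]
    }
  }
}
```

The persistent browser profile means the first run may open a visible browser window for CAPTCHA or login handling; subsequent searches reuse that profile headlessly.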