wet-mcp and web-research-assistant
Both are MCP servers for web search, but they emphasize different things: **elad12390/web-research-assistant** offers broader functionality, integrating with SearXNG and adding tools for package info, GitHub, error translation, and API docs, while **n24q02m/wet-mcp** focuses on content extraction and documentation indexing.
About wet-mcp
n24q02m/wet-mcp
MCP server for web search, content extraction, and documentation indexing
Provides embedded metasearch (SearXNG) with semantic reranking and query expansion, plus specialized academic research across Google Scholar, arXiv, and PubMed. Features local full-text documentation indexing with HyDE-enhanced retrieval, batch content extraction from up to 50 URLs, and multimodal analysis—all with zero-config local embeddings (Qwen3) or optional cloud providers. Integrates as an MCP server with Claude, Gemini, and Codex via stdio transport, with automatic setup and encrypted credential storage.
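Wiring an MCP server into Claude Desktop is done through its `claude_desktop_config.json`, which maps a server name to the command that launches it over stdio. A minimal sketch for wet-mcp might look like the following; the `uvx wet-mcp` launch command is an illustrative assumption, not taken from the project's docs, so check its README for the actual invocation:

```json
{
  "mcpServers": {
    "wet-mcp": {
      "command": "uvx",
      "args": ["wet-mcp"]
    }
  }
}
```

The same stanza shape works for any stdio-transport MCP server; only the server name, command, and args change.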
About web-research-assistant
elad12390/web-research-assistant
MCP server for SearXNG with 13 production-ready tools for web search, package info, GitHub integration, error translation, API docs, and more
Implements the Model Context Protocol over stdio for seamless Claude Desktop and OpenCode integration, with configurable backends including local SearXNG, Exa AI neural search, crawl4ai for content extraction, and Pixabay for images. Exposes 4 MCP resources for direct data lookups (packages, repos, service status, changelogs) and 5 reusable prompt templates alongside the 13 tools, enabling AI agents to conduct structured research workflows with automatic response size limits and usage tracking.
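Under the hood, "Model Context Protocol over stdio" means the client writes newline-delimited JSON-RPC 2.0 messages to the server's stdin and reads responses from its stdout. A minimal sketch of the messages a client would send to handshake and then discover the 13 tools (message construction only; actually talking to the server requires spawning its binary, and the `protocolVersion` and client name here are illustrative):

```python
import json

def jsonrpc(method: str, params: dict, msg_id: int) -> str:
    """Serialize one JSON-RPC 2.0 request as a newline-terminated line,
    the framing MCP's stdio transport expects."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": method,
        "params": params,
    }) + "\n"

# 1. Handshake: declare protocol version and client identity.
init = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "demo-client", "version": "0.1"},
}, msg_id=1)

# 2. Ask the server to enumerate the tools it exposes.
list_tools = jsonrpc("tools/list", {}, msg_id=2)

print(init, end="")
print(list_tools, end="")
```

A host like Claude Desktop or OpenCode performs this exchange automatically; the sketch is only meant to show what the transport carries.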