gptr-mcp and deep-research-mcp-server
About gptr-mcp
assafelovic/gptr-mcp
MCP server for enabling LLM applications to perform deep research via the MCP protocol
Wraps the GPT Researcher autonomous research engine as an MCP server with support for multiple transport modes (stdio for Claude Desktop; SSE/HTTP for Docker and web environments). Provides tools for deep web research, quick search, and report generation that autonomously validate sources and filter noise before returning optimized context, avoiding the context-window waste of feeding raw search results to an LLM. Integrates with Claude Desktop, n8n, and web clients via configurable transport protocols and Docker deployment.
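For the stdio transport mentioned above, Claude Desktop discovers the server through its `mcpServers` config. A minimal sketch, assuming a local checkout; the `server.py` path and the exact environment variable names are placeholders to adapt to the repo's own setup instructions:

```json
{
  "mcpServers": {
    "gptr-mcp": {
      "command": "python",
      "args": ["/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "TAVILY_API_KEY": "your-tavily-key"
      }
    }
  }
}
```

With this entry in place, Claude Desktop launches the server as a subprocess and speaks MCP over stdin/stdout; the SSE/HTTP modes instead expose a network endpoint for Docker and web clients.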
About deep-research-mcp-server
ssdeanx/deep-research-mcp-server
MCP deep research server that uses Gemini to create a research AI agent
Implements iterative deep research through a feedback loop: refining SERP queries based on prior learnings, processing results with semantic chunking and batched Gemini calls, then recursively exploring new directions until depth limits are reached. Built on Gemini 2.5 Flash with optional Google Search Grounding and structured JSON outputs validated via Zod, packaged as an MCP server for seamless integration with Claude and other agent frameworks.
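The feedback loop described above can be sketched as depth-limited recursion: search, extract learnings and follow-up queries, then recurse into new directions. `serp_search` and `extract_learnings` below are hypothetical stand-ins for the server's real SERP and batched Gemini calls; only the control flow is illustrated:

```python
def serp_search(query: str) -> list[str]:
    # Stand-in: a real implementation would call a search API.
    return [f"result for {query!r}"]

def extract_learnings(results: list[str]) -> tuple[list[str], list[str]]:
    # Stand-in for the batched LLM step: distills results into
    # learnings and proposes follow-up queries to explore next.
    learnings = [f"learning from {r}" for r in results]
    follow_ups = [f"follow-up on {r}" for r in results[:1]]
    return learnings, follow_ups

def deep_research(query: str, depth: int, breadth: int) -> list[str]:
    """Recursively refine queries from prior learnings until depth runs out."""
    if depth == 0:
        return []
    results = serp_search(query)
    learnings, follow_ups = extract_learnings(results)
    # Recurse into new directions, bounded by breadth per level.
    for follow_up in follow_ups[:breadth]:
        learnings += deep_research(follow_up, depth - 1, breadth)
    return learnings
```

The depth parameter bounds recursion while breadth bounds fan-out at each level, which is how this style of agent keeps total query volume predictable.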