assafelovic/gptr-mcp
MCP server for enabling LLM applications to perform deep research via the MCP protocol
Wraps the GPT Researcher autonomous research engine as an MCP server, with support for multiple transport modes (stdio for Claude Desktop, SSE/HTTP for Docker and web environments). Provides tools for deep web research, quick search, and report generation that autonomously validate sources and filter noise before returning optimized context, addressing the context-window waste LLMs incur when fed raw search results. Integrates with Claude Desktop, n8n, and web clients via configurable transport protocols and Docker deployment.
Stars: 329
Forks: 53
Language: Python
License: MIT
Category:
Last pushed: Nov 07, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/assafelovic/gptr-mcp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
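The curl call above can also be made programmatically. A minimal stdlib-only Python sketch is below; the URL path shape comes from the curl example, but the JSON response schema is not documented here, so the code returns the parsed payload as-is rather than assuming any fields:

```python
import json
from urllib.request import urlopen

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repo quality endpoint URL.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Anonymous access is rate-limited to 100 requests/day;
    # the response is assumed to be a JSON object.
    with urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)
```

For example, `fetch_quality("assafelovic", "gptr-mcp")` requests the same URL as the curl command; with a free API key (1,000 requests/day) you would presumably attach it per the service's own docs, which are not shown here.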
Related servers
wheattoast11/openrouter-deep-research-mcp
A multi-agent research MCP server + mini client adapter - orchestrates a net of async agents or...
andre-inter-collab-llc/research-workflow-assistant
Open-source AI research assistant for VS Code + GitHub Copilot. Connects to PubMed, OpenAlex,...
ssdeanx/deep-research-mcp-server
MCP Deep Research Server that uses Gemini to create a research AI agent
ssdeanx/Mastervolt-Deep-Research
Mastervolt Deep Research is a sophisticated multi-agent orchestration system built on VoltAgent...
The-Obstacle-Is-The-Way/DeepBoner
AI-powered deep research agent for sexual health