gptr-mcp and deep-research-mcp-server

                  gptr-mcp            deep-research-mcp-server
Overall score     52 (Established)    45 (Stale 6m)
Maintenance       6/25                2/25
Adoption          10/25               8/25
Maturity          16/25               16/25
Community         20/25               19/25
Stars             329                 68
Forks             53                  17
Commits (30d)     0                   0
Language          Python              TypeScript
License           MIT                 MIT
Package           Not published       Not published
Dependents        None                None

About gptr-mcp

assafelovic/gptr-mcp

MCP server for enabling LLM applications to perform deep research via the MCP protocol

Wraps the GPT Researcher autonomous research engine as an MCP server with support for multiple transport modes (stdio for Claude Desktop, SSE/HTTP for Docker and web environments). Provides tools for deep web research, quick search, and report generation that autonomously validate sources and filter noise before returning optimized context—addressing LLM context window waste from raw search results. Integrates with Claude Desktop, n8n, and web clients via configurable transport protocols and Docker deployment.
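For the stdio transport, the server is registered in Claude Desktop's claude_desktop_config.json. A minimal sketch, assuming a Python entrypoint named server.py and the API keys GPT Researcher commonly requires; the exact path, entrypoint, and variable names should be checked against the repo's README:

```json
{
  "mcpServers": {
    "gptr-mcp": {
      "command": "python",
      "args": ["/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "TAVILY_API_KEY": "tvly-..."
      }
    }
  }
}
```

With this entry in place, Claude Desktop launches the server as a subprocess and speaks MCP over stdin/stdout; the SSE/HTTP transports are instead exposed as a network endpoint for Docker and web clients.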

About deep-research-mcp-server

ssdeanx/deep-research-mcp-server

MCP deep-research server that uses Gemini to build a research AI agent

Implements iterative deep research through a feedback loop: refining SERP queries based on prior learnings, processing results with semantic chunking and batched Gemini calls, then recursively exploring new directions until depth limits are reached. Built on Gemini 2.5 Flash with optional Google Search Grounding and structured JSON outputs validated via Zod, packaged as an MCP server for seamless integration with Claude and other agent frameworks.
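The recursive feedback loop described above can be sketched as follows. This is an illustrative outline, not the project's actual code: `searchAndLearn` is a hypothetical stand-in for the real SERP query + semantic chunking + batched Gemini step (whose structured JSON output the project validates with Zod), and the breadth-halving heuristic is an assumption about how exploration narrows at each level:

```typescript
// Accumulated state of one research run.
interface ResearchResult {
  learnings: string[];
  visitedQueries: string[];
}

// Hypothetical stand-in for the SERP search + Gemini processing step:
// returns extracted "learnings" plus follow-up research directions.
function searchAndLearn(query: string): { learnings: string[]; followUps: string[] } {
  return {
    learnings: [`learning about ${query}`],
    followUps: [`${query} details`, `${query} criticisms`],
  };
}

// Recursively refine queries based on prior learnings until the
// depth budget is exhausted, narrowing breadth at each level.
function deepResearch(
  query: string,
  depth: number,
  breadth: number,
  acc: ResearchResult = { learnings: [], visitedQueries: [] },
): ResearchResult {
  if (depth === 0) return acc;
  const { learnings, followUps } = searchAndLearn(query);
  acc.learnings.push(...learnings);
  acc.visitedQueries.push(query);
  // Explore at most `breadth` new directions, each one level deeper.
  for (const next of followUps.slice(0, breadth)) {
    deepResearch(next, depth - 1, Math.ceil(breadth / 2), acc);
  }
  return acc;
}
```

A run with depth 2 and breadth 2 visits the root query plus two follow-ups, mirroring how the server trades depth against breadth before compiling learnings into a report.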

Scores updated daily from GitHub, PyPI, and npm data.