gptr-mcp and openrouter-deep-research-mcp

gptr-mcp
Score: 52 (Established)
Maintenance: 6/25
Adoption: 10/25
Maturity: 16/25
Community: 20/25
Stars: 329
Forks: 53
Downloads:
Commits (30d): 0
Language: Python
License: MIT
Package: none (no dependents)

openrouter-deep-research-mcp
Score: 51
Maintenance: 10/25
Adoption: 8/25
Maturity: 16/25
Community: 17/25
Stars: 42
Forks: 11
Downloads:
Commits (30d): 0
Language: JavaScript
License: MIT
Package: none (no dependents)

About gptr-mcp

assafelovic/gptr-mcp

MCP server for enabling LLM applications to perform deep research via the MCP protocol

Wraps the GPT Researcher autonomous research engine as an MCP server, with support for multiple transport modes (stdio for Claude Desktop, SSE/HTTP for Docker and web environments). Provides tools for deep web research, quick search, and report generation that autonomously validate sources and filter noise before returning optimized context, reducing the context-window waste of passing raw search results to an LLM. Integrates with Claude Desktop, n8n, and web clients via configurable transport protocols and Docker deployment.
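For stdio use with Claude Desktop, an MCP server like this is registered in the client's `claude_desktop_config.json`. A minimal sketch, assuming a local clone with a `server.py` entrypoint and the API keys GPT Researcher typically needs; the path, entrypoint, and environment variable names here are illustrative, not confirmed from the repo:

```json
{
  "mcpServers": {
    "gpt-researcher": {
      "command": "python",
      "args": ["/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "TAVILY_API_KEY": "your-tavily-key"
      }
    }
  }
}
```

For Docker or web deployments, the SSE/HTTP transports would be used instead, with the client pointing at the server's URL rather than spawning a local process.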

About openrouter-deep-research-mcp

wheattoast11/openrouter-deep-research-mcp

A multi-agent research MCP server plus mini client adapter that orchestrates a network of async agents (or a streaming swarm) to conduct ensemble, consensus-backed research. Each task builds its own indexed PGlite database on the fly in WebAssembly. Includes semantic and hybrid search, SQL execution, semaphores, prompts/resources, and more.

Implements consensus-driven research via agents routed through OpenRouter's LLM endpoint, with tools for hybrid search (BM25 + vector), SQL queries on ephemeral PGlite instances, knowledge graph traversal, and session checkpointing. Supports STDIO (the MCP spec default) and HTTP transports with circuit-breaker fault tolerance across multiple API keys. Embedding-based model routing matches queries to domain-optimized LLM tiers without extra API calls, while persistent reporting and undo/fork capabilities enable reproducible research workflows across Claude Desktop, Jan AI, Continue, and other MCP clients.
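The hybrid search the server advertises can be understood as fusing a lexical ranking with a vector ranking. A minimal JavaScript sketch of that idea, illustrative only and not the server's actual code, using a simplified BM25 score, cosine similarity, and reciprocal rank fusion:

```javascript
// Illustrative hybrid retrieval: fuse a BM25-style keyword ranking
// with a cosine-similarity vector ranking via reciprocal rank fusion.
function tokenize(text) {
  return text.toLowerCase().split(/\W+/).filter(Boolean);
}

// Simplified BM25: saturated term frequency weighted by IDF over the corpus.
function bm25Score(queryTerms, docTerms, corpus, k1 = 1.5, b = 0.75) {
  const avgLen = corpus.reduce((s, d) => s + tokenize(d).length, 0) / corpus.length;
  let score = 0;
  for (const term of queryTerms) {
    const tf = docTerms.filter((t) => t === term).length;
    if (tf === 0) continue;
    const df = corpus.filter((d) => tokenize(d).includes(term)).length;
    const idf = Math.log(1 + (corpus.length - df + 0.5) / (df + 0.5));
    score += (idf * tf * (k1 + 1)) /
      (tf + k1 * (1 - b + (b * docTerms.length) / avgLen));
  }
  return score;
}

function cosine(a, b) {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const na = Math.hypot(...a), nb = Math.hypot(...b);
  return na && nb ? dot / (na * nb) : 0;
}

// Rank docs by each signal, then combine with reciprocal rank fusion (RRF).
function hybridSearch(query, queryVec, docs, k = 60) {
  const qTerms = tokenize(query);
  const texts = docs.map((d) => d.text);
  const byBm25 = docs
    .map((d, i) => [i, bm25Score(qTerms, tokenize(d.text), texts)])
    .sort((a, b) => b[1] - a[1]);
  const byVec = docs
    .map((d, i) => [i, cosine(queryVec, d.vec)])
    .sort((a, b) => b[1] - a[1]);
  const fused = new Map();
  for (const ranking of [byBm25, byVec]) {
    ranking.forEach(([i], rank) => {
      fused.set(i, (fused.get(i) || 0) + 1 / (k + rank + 1));
    });
  }
  return [...fused.entries()].sort((a, b) => b[1] - a[1]).map(([i]) => docs[i]);
}
```

RRF sidesteps the problem that BM25 scores and cosine similarities live on incompatible scales: only each signal's rank order contributes to the fused score.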

Scores updated daily from GitHub, PyPI, and npm data.