mcpproxy-go and muster
About mcpproxy-go
smart-mcp-proxy/mcpproxy-go
Supercharge AI Agents, Safely
Implements a federated proxy that aggregates hundreds of MCP servers while compressing their tool schemas behind a single `retrieve_tools` function, cutting token overhead by ~99% and sidestepping client tool limits (Cursor's 40-tool cap, OpenAI's 128-function limit). Built in Go with native system-tray UIs on macOS, Windows, and Linux; supports both stdio and HTTP upstream protocols with optional HTTPS/mTLS, Docker isolation, and an automatic tool-poisoning quarantine that blocks untrusted servers pending manual approval.
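The core idea behind a `retrieve_tools`-style function is that the agent's context holds one search tool instead of hundreds of full schemas; matching schemas are fetched on demand. A minimal, hypothetical sketch of that pattern in Go (the names `toolInfo` and `retrieveTools` are illustrative, not mcpproxy-go's actual API):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// toolInfo is a hypothetical, minimal stand-in for an MCP tool schema entry.
type toolInfo struct {
	Name        string
	Description string
}

// retrieveTools does a naive keyword match over tool names and descriptions
// and returns only the top-k hits, so the agent sees a handful of schemas
// instead of the full federated catalog.
func retrieveTools(query string, index []toolInfo, k int) []toolInfo {
	type scored struct {
		tool  toolInfo
		score int
	}
	var hits []scored
	for _, t := range index {
		haystack := strings.ToLower(t.Name + " " + t.Description)
		score := 0
		for _, w := range strings.Fields(strings.ToLower(query)) {
			if strings.Contains(haystack, w) {
				score++
			}
		}
		if score > 0 {
			hits = append(hits, scored{t, score})
		}
	}
	// Rank by number of matched query words, best first.
	sort.Slice(hits, func(i, j int) bool { return hits[i].score > hits[j].score })
	if len(hits) > k {
		hits = hits[:k]
	}
	out := make([]toolInfo, 0, len(hits))
	for _, h := range hits {
		out = append(out, h.tool)
	}
	return out
}

func main() {
	index := []toolInfo{
		{"k8s_list_pods", "List pods in a Kubernetes namespace"},
		{"grafana_query", "Run a Grafana dashboard query"},
		{"gh_create_issue", "Create a GitHub issue"},
	}
	for _, t := range retrieveTools("kubernetes pods", index, 2) {
		fmt.Println(t.Name) // only the matching tool(s) are surfaced
	}
}
```

A production proxy would use embeddings or BM25 rather than substring matching, but the token economics are the same: the client registers one function, and schema bytes are paid for only when a tool is actually relevant.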
About muster
giantswarm/muster
MCP tool management and workflow proxy
Provides intelligent aggregation of multiple MCP servers through a meta-server architecture, enabling AI agents to dynamically discover and invoke tools without context pollution. Built in Go, it manages MCP server lifecycles, handles prerequisites via ServiceClasses, and persists common tasks as deterministic workflows to reduce costs and latency. Designed for platform engineers integrating Kubernetes, Prometheus, Grafana, and other infrastructure tools with LLM agents in IDEs like VSCode and Cursor.
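Persisting a common task as a deterministic workflow means the agent invokes one named sequence of tool calls instead of re-deriving each call through the LLM. A hypothetical Go sketch of that idea (the `workflow`/`step` types and `runWorkflow` are illustrative, not muster's actual API or schema):

```go
package main

import "fmt"

// step is one deterministic tool invocation within a workflow.
type step struct {
	Tool string
	Args map[string]string
}

// workflow is a named, ordered list of steps. Once persisted, the agent
// triggers it by name, replacing several LLM turns with one call.
type workflow struct {
	Name  string
	Steps []step
}

// runWorkflow executes each step in order through a caller function
// (stubbed here; a real runner would dispatch to upstream MCP servers).
func runWorkflow(w workflow, call func(tool string, args map[string]string) string) []string {
	results := make([]string, 0, len(w.Steps))
	for _, s := range w.Steps {
		results = append(results, call(s.Tool, s.Args))
	}
	return results
}

func main() {
	// Hypothetical example: scale a deployment down, then back up.
	restart := workflow{
		Name: "restart-deployment",
		Steps: []step{
			{Tool: "k8s_scale", Args: map[string]string{"deployment": "api", "replicas": "0"}},
			{Tool: "k8s_scale", Args: map[string]string{"deployment": "api", "replicas": "3"}},
		},
	}
	out := runWorkflow(restart, func(tool string, args map[string]string) string {
		return fmt.Sprintf("%s(replicas=%s)", tool, args["replicas"])
	})
	fmt.Println(out)
}
```

Because the steps are fixed data rather than model output, repeated runs are reproducible and cost no inference tokens beyond the single triggering call.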