Mattbusel/tokio-prompt-orchestrator
Multi-core, Tokio-native orchestration for LLM pipelines.
Provides a five-stage bounded-backpressure DAG with deduplication, circuit breakers, rate limiting, prompt injection detection, and provider arbitrage across Anthropic, OpenAI, llama.cpp, and vLLM backends. Features a plugin system for request/response filtering, session management with context windowing, streaming token aggregation, and composable prompt transformation pipelines. Exposes REST, WebSocket, SSE, MCP, Prometheus metrics, and OpenTelemetry tracing for production LLM inference workloads.
Stars: 50
Forks: 4
Language: Rust
License: MIT
Category:
Last pushed: Mar 10, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/Mattbusel/tokio-prompt-orchestrator"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
Higher-rated alternatives
mcp-use/mcp-use
The fullstack MCP framework to develop MCP Apps for ChatGPT / Claude & MCP Servers for AI Agents.
TencentCloudBase/CloudBase-MCP
CloudBase MCP - Connect CloudBase to your AI Agent. Go from AI prompt to live app.
rusiaaman/wcgw
Shell and coding agent on MCP clients
casibase/casibase
⚡️AI Cloud OS: Open-source enterprise-level AI knowledge base and MCP...
juspay/neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and...