Mattbusel/tokio-prompt-orchestrator

Multi-core, Tokio-native orchestration for LLM pipelines.

Quality score: 39 / 100 (Emerging)

Provides a five-stage bounded-backpressure DAG with deduplication, circuit breakers, rate limiting, prompt injection detection, and provider arbitrage across Anthropic, OpenAI, llama.cpp, and vLLM backends. Features a plugin system for request/response filtering, session management with context windowing, streaming token aggregation, and composable prompt transformation pipelines. Exposes REST, WebSocket, SSE, MCP, Prometheus metrics, and OpenTelemetry tracing for production LLM inference workloads.

No package published · no dependents

Maintenance: 13 / 25
Adoption: 8 / 25
Maturity: 9 / 25
Community: 9 / 25


Stars: 50
Forks: 4
Language: Rust
License: MIT
Last pushed: Mar 10, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/Mattbusel/tokio-prompt-orchestrator"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.