RouteWorks/RouterArena
RouterArena: An open framework for evaluating LLM routers, with standardized datasets, metrics, an automated evaluation pipeline, and a live leaderboard.
The platform evaluates routers across five specialized metrics—accuracy, cost-efficiency, optimal selection rate, robustness, and latency—using a principled dataset spanning 9 domains with difficulty stratification. It provides automated end-to-end evaluation pipelines compatible with both open-source and commercial routers via standardized config files, with extensible support for custom router implementations and multiple LLM inference providers (OpenAI, Anthropic, Gemini, etc.).
Stars: 71
Forks: 12
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 18, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/agents/RouteWorks/RouterArena"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
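The same endpoint can be queried from code. A minimal Python sketch using only the standard library; the response schema is not documented on this page, so the payload is returned as a raw dict rather than a typed structure:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/agents"

def agent_url(owner: str, repo: str) -> str:
    """Build the per-agent endpoint URL from an owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_agent(owner: str, repo: str, timeout: float = 10.0) -> dict:
    """Fetch and JSON-decode one agent record from the quality API."""
    with urllib.request.urlopen(agent_url(owner, repo), timeout=timeout) as resp:
        return json.load(resp)
```

Calling `fetch_agent("RouteWorks", "RouterArena")` returns the same payload as the curl command above, subject to the 100 requests/day anonymous limit.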
Related agents
StonyBrookNLP/appworld
🌍 AppWorld: A Controllable World of Apps and People for Benchmarking Function Calling and...
qualifire-dev/rogue
AI Agent Evaluator & Red Team Platform
future-agi/ai-evaluation
Evaluation Framework for all your AI related Workflows
microsoft/WindowsAgentArena
Windows Agent Arena (WAA) 🪟 is a scalable OS platform for testing and benchmarking of...
agentscope-ai/OpenJudge
OpenJudge: A Unified Framework for Holistic Evaluation and Quality Rewards