axonhub and opencode
axonhub is an open-source AI gateway for managing LLM calls with failover and load balancing, while opencode is an AI coding agent built for the terminal. They are **complements**: the gateway can manage the LLM calls that the coding agent makes.
About axonhub
looplj/axonhub
⚡️ Open-source AI Gateway — Use any SDK to call 100+ LLMs. Built-in failover, load balancing, cost control & end-to-end tracing.
Supports transparent SDK translation across 10+ LLM providers through a unified proxy architecture, enabling applications built with the OpenAI SDK to seamlessly call Claude, Gemini, or other models. Features thread-aware request tracing for complete observability, fine-grained RBAC with usage quotas, and per-token cost breakdown across input, output, and cache operations. Built in Go with Docker support, it includes a web dashboard for channel management, model pricing configuration, and real-time request monitoring.
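The proxy pattern described above can be sketched from the client's side: the application keeps speaking the OpenAI wire format and only the base URL changes. The endpoint path, port, model name, and API key below are illustrative placeholders, not axonhub's documented configuration.

```python
import json
import urllib.request

# Hypothetical gateway endpoint and credentials -- axonhub's actual
# routes and auth scheme are not shown here; this only sketches the
# OpenAI-compatible proxy pattern.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"
API_KEY = "ah-example-key"  # placeholder credential

def build_gateway_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat request aimed at the gateway.

    The client emits the same payload regardless of which upstream
    provider (Claude, Gemini, ...) the gateway routes the call to.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_gateway_request("claude-sonnet", "Summarize this diff.")
# The request targets the gateway host, not a provider's API host.
print(req.full_url)
```

Failover and load balancing then happen behind that single URL, which is why existing OpenAI-SDK applications need no code changes beyond pointing at the gateway.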
About opencode
opencode-ai/opencode
A powerful AI coding agent. Built for the terminal.
Built with Go and Bubble Tea, it integrates the Language Server Protocol for code intelligence and supports tool-use capabilities (command execution, file search, code modification). It connects to multiple AI providers including OpenAI, Anthropic, Google Gemini, AWS Bedrock, Groq, and Azure OpenAI, with session persistence via SQLite and configurable auto-compaction to manage context window limits.
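Auto-compaction of the kind mentioned above can be sketched generically: when the conversation exceeds a token budget, older messages are collapsed into a summary placeholder so recent context survives. This is a minimal illustration of the technique, not opencode's actual strategy; the heuristic, thresholds, and function names are assumptions.

```python
# A minimal sketch of context auto-compaction: collapse old messages
# into a summary stub once a token budget is exceeded. Illustrative
# only -- opencode's real implementation is not shown here.

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token.
    return max(1, len(text) // 4)

def compact(messages: list[dict], budget: int, keep_recent: int = 4) -> list[dict]:
    """Return the history, compacted if it exceeds the token budget."""
    total = sum(estimate_tokens(m["content"]) for m in messages)
    if total <= budget or len(messages) <= keep_recent:
        return messages
    old, recent = messages[:-keep_recent], messages[-keep_recent:]
    # In a real agent this stub would hold an LLM-generated summary.
    summary = {
        "role": "system",
        "content": f"[summary of {len(old)} earlier messages]",
    }
    return [summary] + recent

history = [{"role": "user", "content": "x" * 400} for _ in range(10)]
compacted = compact(history, budget=500)
print(len(compacted))  # 5: one summary stub plus the 4 most recent
```

A real agent would replace the placeholder stub with an LLM-written summary of the dropped turns, but the budget-check-then-collapse shape is the same.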