axonhub and opencode

axonhub is an open-source AI gateway for managing LLM calls with failover and load balancing, while opencode is an AI coding agent built for the terminal. The two are best viewed as **complements**: the gateway can manage the LLM calls that the coding agent makes.
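A minimal sketch of that pairing, assuming the agent's OpenAI-compatible provider honors the SDK-standard `OPENAI_BASE_URL` environment variable; the gateway address, port, and key below are placeholders, not documented defaults of either project:

```shell
# Hypothetical wiring: point an OpenAI-compatible coding agent at a
# locally running axonhub gateway instead of api.openai.com.
export OPENAI_BASE_URL="http://localhost:8090/v1"  # illustrative gateway endpoint
export OPENAI_API_KEY="key-issued-by-the-gateway"  # credential minted by axonhub
```

With this in place, the agent's requests flow through the gateway, which can then apply failover, load balancing, quotas, and tracing to them.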

|               | axonhub                   | opencode                                      |
| ------------- | ------------------------- | --------------------------------------------- |
| Score         | 70 (Verified)             | 46 (Emerging)                                 |
| Maintenance   | 25/25                     | 0/25                                          |
| Adoption      | 10/25                     | 10/25                                         |
| Maturity      | 15/25                     | 16/25                                         |
| Community     | 20/25                     | 20/25                                         |
| Stars         | 2,409                     | 11,374                                        |
| Forks         | 261                       | 1,107                                         |
| Downloads     |                           |                                               |
| Commits (30d) | 130                       | 0                                             |
| Language      | Go                        | Go                                            |
| License       |                           | MIT                                           |
| Flags         | No Package, No Dependents | Archived, Stale 6m, No Package, No Dependents |

About axonhub

looplj/axonhub

⚡️ Open-source AI Gateway — Use any SDK to call 100+ LLMs. Built-in failover, load balancing, cost control & end-to-end tracing.

axonhub supports transparent SDK translation across 10+ LLM providers through a unified proxy architecture, enabling applications built with the OpenAI SDK to seamlessly call Claude, Gemini, or other models. It features thread-aware request tracing for complete observability, fine-grained RBAC with usage quotas, and per-token cost breakdowns across input, output, and cache operations. The gateway is built in Go, ships with Docker support, and includes a web dashboard for channel management, model pricing configuration, and real-time request monitoring.
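The core of that "transparent SDK translation" is request rewriting at the proxy. The sketch below illustrates the idea in Python (axonhub itself is written in Go, and this function is hypothetical, not its actual code); the field names follow the public OpenAI and Anthropic HTTP APIs:

```python
# Illustrative sketch of the translation step a gateway like axonhub
# performs: an OpenAI-style chat request arrives, and the proxy rewrites
# it into the shape another provider expects before forwarding it.

def openai_to_anthropic(request: dict) -> dict:
    """Rewrite an OpenAI chat.completions payload as an Anthropic
    Messages payload. Only the fields needed for the sketch are mapped."""
    # Anthropic carries the system prompt as a top-level "system" field,
    # not as a message with role "system".
    system_parts = [m["content"] for m in request["messages"]
                    if m["role"] == "system"]
    chat = [m for m in request["messages"] if m["role"] != "system"]
    out = {
        "model": request["model"],  # a real gateway may also remap model names
        "messages": chat,
        # Anthropic requires max_tokens; OpenAI treats it as optional.
        "max_tokens": request.get("max_tokens", 1024),
    }
    if system_parts:
        out["system"] = "\n".join(system_parts)
    return out

# A request written against the OpenAI SDK, but naming a Claude model:
openai_req = {
    "model": "claude-sonnet-4",
    "messages": [
        {"role": "system", "content": "You are terse."},
        {"role": "user", "content": "Hi"},
    ],
}
anthropic_req = openai_to_anthropic(openai_req)
```

The application keeps using one SDK; the gateway owns the per-provider differences, which is what makes failover and load balancing across providers possible.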

About opencode

opencode-ai/opencode

A powerful AI coding agent. Built for the terminal.

Built with Go and Bubble Tea, opencode integrates the Language Server Protocol for code intelligence and supports tool use (command execution, file search, code modification). It connects to multiple AI providers, including OpenAI, Anthropic, Google Gemini, AWS Bedrock, Groq, and Azure OpenAI, with session persistence via SQLite and configurable auto-compaction to keep sessions within context-window limits.
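Auto-compaction can be sketched as follows; this is a hypothetical illustration of the general technique, not opencode's actual Go implementation, and both the token estimator and the summary placeholder are stand-ins:

```python
# Sketch of context auto-compaction: when a session's history nears the
# model's context limit, the oldest messages are collapsed into a short
# summary marker so the most recent turns survive intact.

def estimate_tokens(text: str) -> int:
    # Crude stand-in estimator: roughly 4 characters per token.
    return max(1, len(text) // 4)

def compact(history: list, budget: int) -> list:
    """Evict oldest messages until the history fits the token budget."""
    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)

    msgs = list(history)
    dropped = 0
    while len(msgs) > 1 and total(msgs) > budget:
        msgs.pop(0)  # evict the oldest message first
        dropped += 1
    if dropped:
        # A real agent would ask the model to summarize the evicted turns;
        # here we only record how much was compacted.
        note = f"[{dropped} earlier messages compacted]"
        msgs.insert(0, {"role": "system", "content": note})
    return msgs

history = [
    {"role": "user", "content": "x" * 400},       # old, large turn
    {"role": "assistant", "content": "y" * 400},  # old, large turn
    {"role": "user", "content": "latest question"},
]
compacted = compact(history, budget=50)
```

The two oversized early turns are replaced by a single marker, leaving the latest question untouched; persisting the compacted history (e.g. to SQLite, as the description mentions) keeps long sessions resumable.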

Scores updated daily from GitHub, PyPI, and npm data.