KubeAI and LLMKube
Both are Kubernetes operators for LLM inference, direct competitors serving the same use case: deploying and managing ML models on Kubernetes clusters. KubeAI appears more mature, with broader model support (LLMs, VLMs, embeddings, speech-to-text), while LLMKube focuses on GPU-accelerated inference for air-gapped and edge deployments.
Scores (out of 25)

               kubeai   LLMKube
Maintenance    13       13
Adoption       10       7
Maturity       16       9
Community      20       12
Repository stats

               kubeai       LLMKube
Stars          1,161        29
Forks          125          4
Downloads      —            —
Commits (30d)  4            0
Language       Go           Go
License        Apache-2.0   Apache-2.0
Neither project publishes a package, and no dependents are tracked for either.
About kubeai
kubeai-project/kubeai
AI Inference Operator for Kubernetes. The easiest way to serve ML models in production. Supports VLMs, LLMs, embeddings, and speech-to-text.
About LLMKube
defilantech/LLMKube
Kubernetes operator for GPU-accelerated LLM inference - air-gapped, edge-native, production-ready
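To make the operator pattern concrete, a KubeAI `Model` resource is sketched below. Field names follow KubeAI's `kubeai.org/v1` Model CRD as documented upstream, but the model name, Hugging Face URL, resource profile, and replica counts are illustrative placeholders; verify against the current KubeAI documentation before applying.

```yaml
# Illustrative KubeAI Model manifest (kubeai.org/v1 Model CRD).
# Model URL, resource profile, and replica counts are placeholder values.
apiVersion: kubeai.org/v1
kind: Model
metadata:
  name: llama-3.1-8b-instruct
spec:
  features: [TextGeneration]        # KubeAI also supports embeddings and speech-to-text
  url: hf://meta-llama/Llama-3.1-8B-Instruct
  engine: VLLM
  resourceProfile: nvidia-gpu-l4:1  # placeholder GPU profile
  minReplicas: 0                    # scale to zero when idle
  maxReplicas: 3
```

Applying a manifest like this with `kubectl apply -f model.yaml` hands lifecycle management (serving, scaling, scheduling onto GPUs) to the operator, which is the core workflow both projects compete on.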
Scores updated daily from GitHub, PyPI, and npm data.