kubeai-project/kubeai
AI Inference Operator for Kubernetes. The easiest way to serve ML models in production. Supports VLMs, LLMs, embeddings, and speech-to-text.
Quality score: 59 / 100 (Established)
1,161 stars. Actively maintained with 4 commits in the last 30 days.
No package published. No known dependents.
Score breakdown:
Maintenance: 13 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25
Stars: 1,161
Forks: 125
Language: Go
License: Apache-2.0
Category: mlops
Last pushed: Feb 23, 2026
Commits (last 30 days): 4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/kubeai-project/kubeai"
Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000 requests/day.
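The same endpoint can be called from any HTTP client. A minimal Python sketch of the request above, using only the standard library; the response schema is not documented here, so the example just prints the raw JSON, and the helper names are illustrative:

```python
import json
import urllib.request

BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE_URL}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch a quality report as parsed JSON (keyless tier: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Same request as the curl example above.
    report = fetch_quality("mlops", "kubeai-project", "kubeai")
    print(json.dumps(report, indent=2))
```

Passing an API key for the 1,000/day tier is omitted because the key mechanism (header vs. query parameter) is not specified on this page.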
Related tools:
kubeflow/katib (score 67): Automated Machine Learning on Kubernetes
beam-cloud/beta9 (score 65): Ultrafast serverless GPU inference, sandboxes, and background jobs
sgl-project/rbg (score 60): A workload for deploying LLM inference services on Kubernetes
scitix/arks (score 46): A cloud-native inference framework running on Kubernetes
star-whale/starwhale (score 45): An MLOps/LLMOps platform