cncf/llm-starter-pack
🤖 Get started with LLMs on your kind cluster, today!
This project helps developers quickly set up and experiment with large language models (LLMs) in a Kubernetes environment on their local machine. Starting from your existing Docker and Kubernetes tooling, it gives you a running LLM chatbot demo accessible in your browser. It is designed for developers who want to test LLMs in a cloud-native setting without complex infrastructure setup.
Use this if you are a developer looking to rapidly deploy and interact with an LLM in a local Kubernetes cluster.
Not ideal if you are a non-developer seeking an off-the-shelf LLM application or a production-ready deployment.
Stars: 172
Forks: 23
Language: Python
License: —
Category: —
Last pushed: Mar 09, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/cncf/llm-starter-pack"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
kserve/kserve
Standardized Distributed Generative and Predictive AI Inference Platform for Scalable,...
omegaml/omegaml
MLOps simplified. One-stop AI delivery platform, all the features you need.
awslabs/aiops-modules
AIOps modules is a collection of reusable Infrastructure as Code (IaC) modules for Machine...
GoogleCloudDataproc/dataproc-ml-python
Library to simplify running distributed ML workloads with Apache Spark
jina-ai/serve
☁️ Build multimodal AI applications with cloud-native stack