LLM Inference Serving MLOps Tools

There are 6 LLM inference serving tools tracked, 2 of which score above 70 (the verified tier). The highest-rated is bentoml/BentoML at 89/100, with 8,516 stars and 168,316 monthly downloads. 3 of the 6 are actively maintained.

Get all 6 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=mlops&subcategory=llm-inference-serving&limit=20"
```

The API is open to everyone: 100 requests/day with no key needed, or 1,000/day with a free key.
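For programmatic access, the same query can be built in Python. This is a minimal sketch: the URL parameters mirror the curl example above, but the response schema in the second half (a `projects` list with `name` and `score` fields) is an assumption used only to illustrate filtering by the >70 verified cutoff.

```python
from urllib.parse import urlencode

# Endpoint from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def quality_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Build the dataset query URL with the same parameters as the curl example."""
    query = urlencode({"domain": domain, "subcategory": subcategory, "limit": limit})
    return f"{BASE}?{query}"

url = quality_url("mlops", "llm-inference-serving")

# NOTE: the JSON shape below is hypothetical -- check the actual API response.
sample_response = {
    "projects": [
        {"name": "bentoml/BentoML", "score": 89},
        {"name": "kubeflow/trainer", "score": 67},
    ]
}
# Keep only projects above the verified-tier cutoff (score > 70).
verified = [p["name"] for p in sample_response["projects"] if p["score"] > 70]
```

Fetching `url` with any HTTP client then returns the live data; the filter shown applies equally to the real payload once the actual field names are confirmed.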

| # | Tool | Description | Score | Tier |
|---|------|-------------|-------|------|
| 1 | bentoml/BentoML | The easiest way to serve AI apps and models - Build Model Inference APIs,... | 89 | Verified |
| 2 | nndeploy/nndeploy | An Easy-to-Use and High-Performance AI Deployment Framework | 75 | Verified |
| 3 | kubeflow/trainer | Distributed AI Model Training and LLM Fine-Tuning on Kubernetes | 67 | Established |
| 4 | cncf/llm-in-action | 🤖 Discover how to apply your LLM app skills on Kubernetes! | 42 | Emerging |
| 5 | ray-project/llms-in-prod-workshop-2023 | Deploy and Scale LLM-based applications | 26 | Experimental |
| 6 | SohamGovande/podplex | 🦾💻🌐 distributed training & serverless inference at scale on RunPod | 19 | Experimental |