Model Inference Serving MLOps Tools

Of the 13 model inference serving tools tracked, 1 scores above 70 (Verified tier). The highest-rated is feast-dev/feast at 97/100, with 6,793 stars and 1,004,994 monthly downloads. 2 of the top 10 are actively maintained.

Get all 13 projects as JSON:

```bash
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=mlops&subcategory=model-inference-serving&limit=20"
```

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.
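For scripted use, the same endpoint can be called from Python. A minimal sketch using only the standard library, assuming the query parameters shown in the curl example above (the JSON response shape is not documented here, so this only builds the request):

```python
# Build a GET request for the quality dataset endpoint.
# BASE_URL and the query parameters come from the curl example on this page;
# build_request is a hypothetical helper, not part of any published client.
from urllib.parse import urlencode
from urllib.request import Request

BASE_URL = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_request(domain: str, subcategory: str, limit: int = 20) -> Request:
    """Build the GET request for one domain/subcategory slice of the dataset."""
    query = urlencode({"domain": domain, "subcategory": subcategory, "limit": limit})
    return Request(f"{BASE_URL}?{query}")

req = build_request("mlops", "model-inference-serving")
print(req.full_url)
```

To fetch and decode the results, pass the request to `urllib.request.urlopen(req)` and parse the body with `json.load`; an API key, if you have one, would presumably be sent as a header, but the header name is not specified on this page.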

| # | Tool | Description | Score | Tier |
|---|------|-------------|-------|------|
| 1 | feast-dev/feast | The Open Source Feature Store for AI/ML | 97 | Verified |
| 2 | clearml/clearml-serving | ClearML - Model-Serving Orchestration and Repository Solution | 63 | Established |
| 3 | lakehq/sail | LakeSail's computation framework with a mission to unify batch processing,... | 60 | Established |
| 4 | SeldonIO/MLServer | An inference server for your machine learning models, including support for... | 57 | Established |
| 5 | PaddlePaddle/Serving | A flexible, high-performance carrier for machine learning models (PaddlePaddle serving deployment framework) | 54 | Established |
| 6 | pytorch/serve | Serve, optimize and scale PyTorch models in production | 51 | Established |
| 7 | sustainable-computing-io/kepler-model-server | Model Server for Kepler | 46 | Emerging |
| 8 | raptor-ml/raptor | Transform your pythonic research to an artifact that engineers can deploy easily. | 41 | Emerging |
| 9 | tugraz-isds/systemds | An open source ML system for the end-to-end data science lifecycle | 35 | Emerging |
| 10 | george0st/qgate-model | ML/AI meta-model, used in MLRun/Iguazio/Nuclio, see qgate-sln- | 33 | Emerging |
| 11 | aporia-ai/inferencedb | 🚀 Stream inferences of real-time ML models in production to any data lake... | 24 | Experimental |
| 12 | eora-ai/inferoxy | Service for quick deploying and using dockerized Computer Vision models | 24 | Experimental |
| 13 | gasparian/ml-serving-template | Serving large ml models independently and asynchronously via message queue... | 20 | Experimental |