aporia-ai/inferencedb
🚀 Stream inferences of real-time ML models in production to any data lake (Experimental)
No commits in the last 6 months.
Stars: 81
Forks: 3
Language: Python
License: —
Category: mlops
Last pushed: Jun 10, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/aporia-ai/inferencedb"
Open to everyone: 100 requests/day with no API key. A free key raises the limit to 1,000 requests/day.
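The curl command above can also be scripted. Below is a minimal sketch in Python using only the standard library; the endpoint URL pattern is taken from the example above, but the JSON response schema is undocumented here, so the fetch helper simply decodes whatever the API returns.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the quality-data endpoint URL for a repo,
    e.g. quality_url("mlops", "aporia-ai/inferencedb")."""
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str, timeout: float = 10.0) -> dict:
    """Fetch and decode the JSON payload (schema is an assumption:
    the API is only documented via the curl example above)."""
    with urllib.request.urlopen(quality_url(category, repo), timeout=timeout) as resp:
        return json.load(resp)
```

Without a key this counts against the 100 requests/day anonymous quota, so cache responses rather than polling.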
Higher-rated alternatives
feast-dev/feast
The Open Source Feature Store for AI/ML
clearml/clearml-serving
ClearML - Model-Serving Orchestration and Repository Solution
lakehq/sail
LakeSail's computation framework with a mission to unify batch processing, stream processing,...
SeldonIO/MLServer
An inference server for your machine learning models, including support for multiple frameworks,...
PaddlePaddle/Serving
A flexible, high-performance carrier for machine learning models (the PaddlePaddle "飞桨" serving deployment framework)