jeffasante/cellm
Research project: a mobile-native LLM serving engine in Rust, with a paged KV cache, multi-session scheduling, and Metal/Vulkan kernels for on-device inference under 512 MB of RAM.
Overall score: 37 / 100 (Emerging)
No package published · No dependents

Score breakdown:
- Maintenance: 13 / 25
- Adoption: 3 / 25
- Maturity: 9 / 25
- Community: 12 / 25
Stars: 3
Forks: 1
Language: Rust
License: Apache-2.0
Last pushed: Apr 06, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jeffasante/cellm"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
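The curl call above can also be scripted. Below is a minimal Python sketch that builds the same endpoint URL and parses a response payload. The URL comes from the page; the JSON field names (`score`, `tier`, `breakdown`) are an assumption inferred from the score breakdown shown here, not documented API fields, so the sample is parsed from a local string rather than fetched live.

```python
import json

# Quality endpoint shown above; the path is <category>/<owner>/<repo>.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repo's quality report."""
    return f"{BASE}/{category}/{owner}/{repo}"

# Hypothetical response shape, inferred from the page's score breakdown.
# Actual field names are an assumption.
sample = json.loads("""
{
  "repo": "jeffasante/cellm",
  "score": 37,
  "tier": "Emerging",
  "breakdown": {"maintenance": 13, "adoption": 3, "maturity": 9, "community": 12}
}
""")

url = quality_url("llm-tools", "jeffasante", "cellm")
print(url)
print(sample["score"], sample["tier"])
```

To fetch live data, pass `url` to any HTTP client (e.g. `urllib.request.urlopen`); note the 100-requests/day limit for keyless access.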
Higher-rated alternatives
- openai/openai-python (96): The official Python library for the OpenAI API
- pydantic/pydantic (93): Data validation using Python type hints
- campfirein/byterover-cli (88): ByteRover CLI (brv), the portable memory layer for autonomous coding agents (formerly Cipher)
- mistralai/client-python (88): Python client library for the Mistral AI platform
- cohere-ai/cohere-python (86): Python library for accessing the Cohere API