ahmedmagood/cpu-slm
🖥️ Explore CPU-SLM, a Rust-based SLM/LLM project that runs on CPU, offering efficient inference and chat with minimal dependencies.
Stars: —
Forks: —
Language: Rust
License: GPL-3.0
Category: —
Last pushed: Mar 13, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ahmedmagood/cpu-slm"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
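The curl command above can also be wrapped in a small script. A minimal Python sketch, assuming the unauthenticated endpoint returns JSON (the response schema is not documented on this page, so only the URL shape is taken from it):

```python
"""Sketch of calling the pt-edge quality API for a GitHub repo.

Only the endpoint URL comes from the page; the JSON response
schema is an assumption and is returned as an untyped dict.
"""
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality report (network access required)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


# Example: the URL for the repo described on this page.
print(quality_url("ahmedmagood", "cpu-slm"))
```

`fetch_quality` performs the same request as the curl example; within the free tier no authentication header is needed.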
Higher-rated alternatives
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
bentoml/OpenLLM
Run any open-source LLMs, such as DeepSeek and Llama, as an OpenAI-compatible API endpoint in the cloud.
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
mudler/LocalAI
The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...