Mattbusel/llm-wasm
LLM inference primitives for WebAssembly — cache, retry, routing, guards, cost tracking, templates
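The tagline lists retry among the crate's primitives. As a rough illustration of what such a primitive does, here is a minimal retry helper in Rust; the name `retry` and its signature are hypothetical and are not taken from llm-wasm's actual API.

```rust
/// Illustrative sketch of a retry primitive (hypothetical; not llm-wasm's API).
/// Runs `op` up to `max_attempts` times, returning the first `Ok`,
/// or the last `Err` if every attempt fails.
fn retry<T, E>(max_attempts: u32, mut op: impl FnMut() -> Result<T, E>) -> Result<T, E> {
    let mut last_err = None;
    for _ in 0..max_attempts {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) => last_err = Some(e),
        }
    }
    // Panics only if max_attempts == 0, which would mean op was never run.
    Err(last_err.expect("max_attempts must be > 0"))
}

fn main() {
    // Simulate an operation that fails twice, then succeeds on the third call.
    let mut calls = 0;
    let result = retry(5, || {
        calls += 1;
        if calls < 3 { Err("transient failure") } else { Ok(calls) }
    });
    assert_eq!(result, Ok(3));
    assert_eq!(calls, 3); // stopped as soon as it succeeded
}
```

A production version would typically add backoff between attempts and only retry on errors classified as transient.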
Stars: 2
Forks: —
Language: Rust
License: MIT
Category: —
Last pushed: Mar 09, 2026
Monthly downloads: 4
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Mattbusel/llm-wasm"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
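Judging from the curl example above, the endpoint follows the pattern `/api/v1/quality/transformers/{owner}/{repo}`. A small helper can build the URL for any repository; the function name `quality_url` is my own, and the path pattern is inferred from the single example rather than documented.

```python
from urllib.parse import quote

# Base path taken from the curl example above; the generalization to
# arbitrary owner/repo pairs is an assumption.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a GitHub-style owner/repo pair."""
    # Percent-encode each path segment so unusual characters stay valid.
    return f"{API_BASE}/{quote(owner, safe='')}/{quote(repo, safe='')}"

print(quality_url("Mattbusel", "llm-wasm"))
```

The resulting URL can then be fetched with curl, as shown above, or any HTTP client.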
Higher-rated alternatives
EricLBuehler/mistral.rs
Fast, flexible LLM inference
nerdai/llms-from-scratch-rs
A comprehensive Rust translation of the code from Sebastian Raschka's Build an LLM from Scratch book.
brontoguana/krasis
Krasis is a hybrid LLM runtime that focuses on efficiently running larger models on consumer...
ShelbyJenkins/llm_utils
llm_utils: Basic LLM tools, best practices, and minimal abstraction.
GoWtEm/llm-model-selector
A high-performance Rust utility that analyzes your system hardware to recommend the optimal LLM...