hilum-labs/local-llm-rn
React Native SDK for local LLM inference and on-device AI on iOS and Android.
Quality score: 23 / 100 (Experimental)
No package published · No dependents
Maintenance: 13 / 25
Adoption: 1 / 25
Maturity: 9 / 25
Community: 0 / 25
Stars: 1
Forks: —
Language: TypeScript
License: MIT
Category:
Last pushed: Mar 14, 2026
Commits (last 30 days): 0
Get this data via the API:

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/hilum-labs/local-llm-rn"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.
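The same lookup can be done programmatically. A minimal TypeScript sketch, assuming the endpoint pattern (base URL + category + owner/repo) generalizes from the single curl example above; the JSON response shape is not documented here and is an assumption:

```typescript
// Base of the quality API, taken from the curl example on this page.
const BASE = "https://pt-edge.onrender.com/api/v1/quality";

// Build the endpoint URL for a given category and owner/repo pair.
// The path layout is inferred from the one example shown; other
// categories or shapes may exist that this sketch does not cover.
function qualityUrl(category: string, owner: string, repo: string): string {
  return `${BASE}/${category}/${owner}/${repo}`;
}

// Usage (no API key needed up to 100 requests/day):
// fetch(qualityUrl("llm-tools", "hilum-labs", "local-llm-rn"))
//   .then((res) => res.json())
//   .then((data) => console.log(data));
```

The fetch call is left commented out so the sketch stays side-effect free; swap in your own owner/repo to query a different project.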
Higher-rated alternatives:
- mlc-ai/web-llm (86): High-performance In-browser LLM Inference Engine
- e2b-dev/desktop (68): E2B Desktop Sandbox for LLMs. E2B Sandbox with desktop graphical environment that you can...
- geekjr/quickai (63): QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art...
- Azure-Samples/llama-index-javascript (56): This sample shows how to quickly get started with LlamaIndex.ai on Azure 🚀
- ParisNeo/lollms (54): An all in one AI solution compatible with any known AI service on the planet