SxryxnshS5/onenm_local_llm

onenm_local_llm is a Flutter plugin that simplifies on-device language model inference on Android using llama.cpp. It removes the complexity of setting up native runtimes, model loading, and inference pipelines, so developers can integrate local AI into their apps through a simple API.

Quality score: 38 / 100 (Emerging)
No package published · No dependents

Maintenance: 13 / 25
Adoption: 4 / 25
Maturity: 9 / 25
Community: 12 / 25


Stars: 5
Forks: 1
Language: C++
License: MIT
Last pushed: Mar 19, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/SxryxnshS5/onenm_local_llm"

The API is open to everyone: 100 requests/day with no key, or 1,000/day with a free API key.
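The curl call above can be wrapped in a small Python helper. A minimal sketch, assuming only the URL pattern shown in the example; the `fetch_quality` function and any fields in the JSON response are assumptions, not documented API behavior:

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API endpoint URL for an owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (requires network access).

    The response schema is not documented here, so callers should
    inspect the returned dict rather than assume specific keys.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("SxryxnshS5", "onenm_local_llm")` issues the same request as the curl command shown above.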