RandomGamingDev/local-on-device-llm-browser-extension-example
An example & tutorial for running a local LLM entirely on the user's device (no cloud or internet required) inside a browser extension, using the Google MediaPipe library. This avoids the performance, network, environmental, and privacy costs of cloud inference, and the repo also works well as general boilerplate.
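The on-device approach described above loads an LLM inference runtime that executes as WebAssembly, so a Chrome Manifest V3 extension generally needs a content security policy that permits wasm compilation. Below is a minimal sketch of such a manifest.json, assuming Chrome 103+ (which added the `'wasm-unsafe-eval'` directive); every name, path, and version string is illustrative and not taken from the repo:

```json
{
  "manifest_version": 3,
  "name": "local-llm-example",
  "version": "0.1.0",
  "action": { "default_popup": "popup.html" },
  "content_security_policy": {
    "extension_pages": "script-src 'self' 'wasm-unsafe-eval'; object-src 'self'"
  }
}
```

The `'wasm-unsafe-eval'` source in `extension_pages` is what allows extension pages to compile and run WebAssembly under Manifest V3; without it, wasm-backed libraries typically fail with a CSP error.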
Stars: 2
Forks: —
Language: JavaScript
License: MIT
Category: —
Last pushed: Jan 25, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/RandomGamingDev/local-on-device-llm-browser-extension-example"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
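The same endpoint can be called programmatically. The sketch below assumes only the URL shape shown in the curl command above; the structure of the JSON response is not documented here, so it is simply returned as-is. `buildQualityUrl` and `fetchQuality` are hypothetical helper names, not part of any published client.

```javascript
// Base endpoint taken from the curl example above.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

// Build the per-repo URL; encodeURIComponent guards against odd characters.
function buildQualityUrl(owner, repo) {
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Hypothetical helper: fetch the quality record (Node 18+ or browser fetch).
async function fetchQuality(owner, repo) {
  const res = await fetch(buildQualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API returned ${res.status}`);
  return res.json();
}

// Example (requires network and counts against the 100 requests/day limit):
// fetchQuality("RandomGamingDev", "local-on-device-llm-browser-extension-example")
//   .then((data) => console.log(data));
```

Keeping the URL construction in its own small function makes it easy to reuse for other repos listed on this page.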
Higher-rated alternatives:
- mlc-ai/web-llm: High-performance In-browser LLM Inference Engine
- e2b-dev/desktop: E2B Desktop Sandbox for LLMs. E2B Sandbox with desktop graphical environment that you can...
- geekjr/quickai: QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art...
- chrisrobison/textweb: A text-grid web renderer for AI agents — see the web without screenshots
- Azure-Samples/llama-index-javascript: This sample shows how to quickly get started with LlamaIndex.ai on Azure 🚀