hannes-sistemica/browser-llm-webgpu
Proof of concept for a reasoning model that runs locally in your browser with WebGPU acceleration
No commits in the last 6 months.
Stars: 18
Forks: 4
Language: HTML
License: —
Category: —
Last pushed: Jan 22, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/hannes-sistemica/browser-llm-webgpu"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
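For scripting, the same endpoint can be called from Python. A minimal sketch, assuming the `{owner}/{repo}` path pattern generalizes from the single curl URL shown above (the response schema is not documented here, so this only builds and prints the request URL):

```python
from urllib.parse import quote

# Base endpoint taken from the curl example above; the generalized
# {owner}/{repo} pattern is an assumption extrapolated from that one URL.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data endpoint URL for an owner/repo pair."""
    # quote() percent-encodes any characters unsafe in a URL path segment
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

print(quality_url("hannes-sistemica", "browser-llm-webgpu"))
```

Pass the resulting URL to `urllib.request.urlopen` or any HTTP client to fetch the data, the same way the curl example does.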
Higher-rated alternatives
mlc-ai/web-llm
High-performance In-browser LLM Inference Engine
e2b-dev/desktop
E2B Desktop Sandbox for LLMs. E2B Sandbox with desktop graphical environment that you can...
geekjr/quickai
QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art...
Azure-Samples/llama-index-javascript
This sample shows how to quickly get started with LlamaIndex.ai on Azure 🚀
ParisNeo/lollms
An all in one AI solution compatible with any known AI service on the planet