mrithip/local-llm-desktop
A fully local, offline desktop AI assistant built in C++ with wxWidgets and powered by llama.cpp.
Score: 19 / 100 (Experimental)
No Package · No Dependents

Maintenance: 10 / 25
Adoption: 0 / 25
Maturity: 9 / 25
Community: 0 / 25

Stars: —
Forks: —
Language: C++
License: MIT
Category:
Last pushed: Feb 09, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mrithip/local-llm-desktop"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
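The endpoint follows a simple path pattern, so the same query can be built for any repository. A minimal Python sketch, assuming only the URL layout shown in the curl example above (the `transformers` path segment is copied verbatim from it and may be a category slug; that reading is an assumption):

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(owner: str, repo: str, category: str = "transformers") -> str:
    """Build the quality-score API URL for a repository.

    The path layout is taken from the page's curl example; the default
    "transformers" segment looks like a category slug, but that is an
    assumption, not documented behavior.
    """
    # quote() guards against characters that are not URL-safe.
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

# Reproduces the curl example's URL for this repository:
print(quality_url("mrithip", "local-llm-desktop"))
```

Fetching that URL with any HTTP client (curl, `urllib.request`, etc.) returns the same data shown on this page.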
Higher-rated alternatives
- cel-ai/celai (64): Open source framework designed to accelerate the development of omnichannel AI virtual assistants.
- sauravpanda/BrowserAI (62): Run local LLMs like llama, deepseek-distill, kokoro and more inside your browser.
- lone-cloud/gerbil (48): A desktop app for running Large Language Models locally.
- cztomsik/ava (40): All-in-one desktop app for running LLMs locally.
- vinjn/llm-metahuman (40): An open solution for AI-powered photorealistic digital humans.