cztomsik/ava
All-in-one desktop app for running LLMs locally.
Provides a batteries-included GUI wrapper around llama.cpp with built-in model management and chat capabilities. The inference backend is built in Zig and C++, state is persisted in SQLite, and the frontend uses Preact with Tailwind CSS. Supports headless operation via CLI, enabling both desktop and programmatic access to local LLM inference.
Stars: 465
Forks: 18
Language: TypeScript
License: —
Category:
Last pushed: Jan 23, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/cztomsik/ava"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
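The same request can be made from TypeScript. A minimal sketch, assuming a runtime with built-in `fetch` (Node 18+ or a browser); the endpoint URL is taken from the page above, but the JSON response schema is not documented here, so the result is left as `unknown`:

```typescript
// Endpoint shown on this page for cztomsik/ava quality data.
const AVA_QUALITY_URL =
  "https://pt-edge.onrender.com/api/v1/quality/transformers/cztomsik/ava";

// Fetch the quality data and parse it as JSON.
// The response shape is undocumented on this page, hence `unknown`.
async function fetchQuality(url: string = AVA_QUALITY_URL): Promise<unknown> {
  const res = await fetch(url);
  if (!res.ok) {
    // The keyless tier allows 100 requests/day; a failure here may mean
    // the quota was exhausted or the service is unavailable.
    throw new Error(`HTTP ${res.status}`);
  }
  return res.json();
}

// Example usage (a live network call, so errors are logged, not thrown):
fetchQuality()
  .then((data) => console.log(data))
  .catch((err) => console.error("request failed:", err));
```

How an API key raises the limit to 1,000/day is not specified on this page, so no key-passing mechanism is shown above.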
Higher-rated alternatives
sauravpanda/BrowserAI
Run local LLMs like llama, deepseek-distill, kokoro and more inside your browser
cel-ai/celai
Open source framework designed to accelerate the development of omnichannel AI virtual assistants.
lone-cloud/gerbil
A desktop app for running Large Language Models locally.
vinjn/llm-metahuman
An open solution for AI-powered photorealistic digital humans.
snwfdhmp/llm
Use any LLM from the command line.