abdulrahimzulfiqar/QueryMate
A portable, offline-first CLI AI assistant built with C++ and llama.cpp. Features dynamic file context awareness, memory management, and runs entirely from a USB drive without dependencies.
Stars: 3
Forks: —
Language: C++
License: —
Category:
Last pushed: Nov 29, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/abdulrahimzulfiqar/QueryMate"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
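The same endpoint can also be called programmatically. Below is a minimal sketch in Python using only the standard library; it assumes the endpoint returns JSON (the field names are not documented in this listing, so the response is passed through as-is):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def build_url(owner: str, repo: str) -> str:
    # Build the per-repository endpoint URL shown in the curl example above.
    return f"{API_BASE}/{owner}/{repo}"

def fetch_repo_quality(owner: str, repo: str) -> dict:
    # Fetch quality data for one repository; no key is needed for up to
    # 100 requests/day. The JSON schema is an assumption here, so the
    # parsed dict is returned unmodified.
    with urllib.request.urlopen(build_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

# Usage (makes a live network call, counted against the daily limit):
# data = fetch_repo_quality("abdulrahimzulfiqar", "QueryMate")
```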
Higher-rated alternatives
sauravpanda/BrowserAI
Run local LLMs such as llama, deepseek-distill, kokoro, and more inside your browser.
cel-ai/celai
Open source framework designed to accelerate the development of omnichannel AI virtual assistants.
lone-cloud/gerbil
A desktop app for running Large Language Models locally.
cztomsik/ava
All-in-one desktop app for running LLMs locally.
vinjn/llm-metahuman
An open solution for AI-powered photorealistic digital humans.