johnbean393/Sidekick

A native macOS app that allows users to chat with a local LLM that can respond with information from files, folders and websites on your Mac without installing any other software. Powered by llama.cpp.

Quality score: 63 / 100 (Established)

Implements RAG (Retrieval Augmented Generation) with modular "experts" that organize files, folders, and web sources for contextual responses with source citations. Features function calling for agent-based tasks, deep research workflows spanning 50-80 webpages, and local tool execution (contacts, email integration). Supports both embedded GGUF models via llama.cpp and OpenAI-compatible APIs, with automatic image generation, Canvas for live editing, and persistent memory across conversations.

3,184 stars. Actively maintained with 8 commits in the last 30 days.

No package · No dependents

Maintenance: 20 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 17 / 25


Stars: 3,184
Forks: 142
Language: Swift
License: MIT
Last pushed: Mar 12, 2026
Commits (30d): 8

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/johnbean393/Sidekick"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
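For scripted access, the curl command above can be wrapped in a small client. This is a minimal sketch: only the URL pattern comes from the example above; the helper names and the assumption that the endpoint returns a JSON body are mine, not documented behavior.

```python
# Minimal sketch of a client for the pt-edge quality endpoint.
# The URL pattern is taken from the curl example; the JSON response
# shape is NOT documented here, so fetch_quality returns it untyped.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (shape is an assumption)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(quality_url("johnbean393", "Sidekick"))
```

Without a key this stays within the 100-requests/day anonymous tier; a keyed variant would depend on how the API expects credentials, which is not specified here.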