johnbean393/Sidekick
A native macOS app that lets users chat with a local LLM, which can answer with information from files, folders, and websites on your Mac, without installing any other software. Powered by llama.cpp.
Implements RAG (Retrieval Augmented Generation) with modular "experts" that organize files, folders, and web sources for contextual responses with source citations. Features function calling for agent-based tasks, deep research workflows spanning 50-80 webpages, and local tool execution (contacts, email integration). Supports both embedded GGUF models via llama.cpp and OpenAI-compatible APIs, with automatic image generation, Canvas for live editing, and persistent memory across conversations.
3,184 stars. Actively maintained with 8 commits in the last 30 days.
Stars: 3,184
Forks: 142
Language: Swift
License: MIT
Category:
Last pushed: Mar 12, 2026
Commits (30d): 8
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/johnbean393/Sidekick"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
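The curl command above can also be issued from code. The sketch below is a minimal, hedged example: the endpoint path comes from the snippet above, but the response schema and the authorization header name are assumptions, not documented here.

```python
# Hypothetical sketch of calling the quality API shown above.
# Only the URL pattern is taken from this page; the response fields
# and the auth header are assumptions.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the endpoint URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str, api_key=None) -> dict:
    """GET the quality record; a free key raises the limit to 1,000/day."""
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        # Header name is an assumption; check the API docs for the real one.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `fetch_quality("johnbean393", "Sidekick")` would request the same URL as the curl command above.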
Related tools
dappros/ethora
Open-source engine for chat 💬, AI assistants 🤖 & wallets 🪪. React, Typescript, Python, XMPP....
kamjin3086/chatless
💻 A simple, practical, and lightweight local AI chat client built with Tauri 2.0 and Next.js...
abraxas914/VESTI
Local-first AI conversation memory hub to capture, search, summarize, and export chats across...
isaccanedo/lobe-chat
🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports Multi AI Providers(...
diandiancha/LittleAIBox
A privacy-focused AI chat platform built with Vite + Capacitor + Cloudflare. Runs locally or in...