srikanth235/privy
An open-source alternative to GitHub Copilot that runs locally.
Supports both real-time code completion and a conversational chat interface within VS Code, with pluggable local LLM backends including Ollama, llamafile, and llama.cpp. Configuration allows separate model selection for completion and chat tasks, so users can optimize for latency or quality based on hardware constraints. Integrates exclusively with VS Code via the extension marketplace, preserving data privacy by routing all inference through locally running language models.
992 stars. No commits in the last 6 months.
Stars: 992
Forks: 51
Language: TypeScript
License: MIT
Category:
Last pushed: May 14, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/srikanth235/privy"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
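The curl example above suggests a predictable path scheme of the form /api/v1/quality/&lt;category&gt;/&lt;owner&gt;/&lt;repo&gt;. A minimal Python sketch for building that URL and fetching the payload, assuming the response is JSON (the path scheme is inferred from the single example; the fields returned are not documented here):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repo; path scheme inferred from the curl example."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (free tier: 100 requests/day, no key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


print(quality_url("ai-coding", "srikanth235", "privy"))
# → https://pt-edge.onrender.com/api/v1/quality/ai-coding/srikanth235/privy
```

With a free API key for the higher 1,000/day tier, the request would presumably carry the key in a header or query parameter; the exact mechanism is not stated on this page.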
Higher-rated alternatives
CNSeniorious000/oai2ollama
OpenAI API -> Ollama API (original idea:...
theblixguy/xcode-copilot-server
GitHub Copilot proxy for Xcode with support for Claude Agent and Codex Agent.
yuchanns/copilot-openai-api
A FastAPI proxy server that seamlessly turns GitHub Copilot's chat completion/embeddings...
hankchiutw/copilot-proxy
A simple HTTP proxy that exposes your GitHub Copilot free quota as an OpenAI-compatible API
shyamsridhar123/ClaudeCode-Copilot-Proxy
Proxy server enabling Claude Code to work seamlessly with GitHub Copilot. Bridges AI coding...