pushinspektor856/GitHub-Copilot-Free

An ultra-fast C++ daemon proxy that replaces the official GitHub Copilot endpoint, letting you use free local or open-source LLMs inside VS Code and JetBrains IDEs. Advanced AI code completion with zero subscription fees and sub-millisecond latency.

Score: 31 / 100 (Emerging)

Intercepts Copilot extension requests via an asynchronous C++20 proxy, reformats AST syntax context on the fly, and routes to OpenAI-compatible APIs or local LLMs (Ollama, DeepSeek Coder, Llama-3), while stripping telemetry payloads. Works transparently with VS Code and JetBrains IDEs (including WebStorm) through spoofed authentication tokens. Achieves sub-0.5 ms latency with a 4 MB footprint by handling token substitution synchronously at the network protocol layer.

No package · No dependents

Maintenance: 13 / 25
Adoption: 9 / 25
Maturity: 9 / 25
Community: 0 / 25


Stars: 75
Forks:
Language: C++
License: MIT
Last pushed: Mar 21, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/pushinspektor856/GitHub-Copilot-Free"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.