llama.vscode and ollama-autocoder
Both are VS Code extensions that provide local LLM-based code completion, making them direct competitors: they offer similar functionality through different underlying inference engines (llama.cpp vs. Ollama).
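To illustrate the backend difference, here is a minimal TypeScript sketch of how an extension like ollama-autocoder could request a completion from a local Ollama server. The endpoint (`POST /api/generate`) and its `model`, `prompt`, `stream`, and `options` fields come from Ollama's documented REST API; the model name, parameter values, and helper function are illustrative assumptions, not the extension's actual code.

```typescript
// Shape of the documented Ollama /api/generate request body (subset of fields).
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
  options?: { num_predict?: number; temperature?: number };
}

// Build the JSON payload for a completion request.
// Model name and option values are assumptions for illustration.
function buildCompletionRequest(
  prefix: string,
  model = "codellama:7b"
): GenerateRequest {
  return {
    model,
    prompt: prefix,          // text before the cursor
    stream: true,            // stream tokens as they are generated
    options: { num_predict: 64, temperature: 0.2 },
  };
}

// Sending the request requires a running Ollama instance on its default port.
async function complete(prefix: string): Promise<number> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildCompletionRequest(prefix)),
  });
  // When stream is true, each line of the response body is a JSON object
  // carrying a `response` text chunk.
  return res.status;
}
```

llama.vscode talks instead to a llama.cpp `llama-server` process, so the two extensions are not interchangeable: each assumes its own local server and wire format.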
Scores (each out of 25):

llama.vscode: Maintenance 13, Adoption 10, Maturity 16, Community 18
ollama-autocoder: Maintenance 2, Adoption 10, Maturity 16, Community 20
llama.vscode: Stars 1,197; Forks 107; Downloads: —; Commits (30d): 3; Language: TypeScript; License: MIT
ollama-autocoder: Stars 146; Forks 31; Downloads: —; Commits (30d): 0; Language: TypeScript; License: MIT
llama.vscode: no published package, no tracked dependents, flagged stale (6 months).
ollama-autocoder: no published package, no tracked dependents.
About llama.vscode
ggml-org/llama.vscode
VS Code extension for LLM-assisted code/text completion
About ollama-autocoder
10Nates/ollama-autocoder
An easy-to-use Ollama autocompletion engine with exposed options and streaming functionality
Scores are updated daily from GitHub, PyPI, and npm data.