llama.vscode and llama.vim

These are ecosystem siblings: parallel editor integrations built on the same underlying llama.cpp inference engine. Users choose between them based on their preferred editor (VS Code or Vim) rather than on functional differences.

|                  | llama.vscode  | llama.vim     |
|------------------|---------------|---------------|
| Overall score    | 57 (Established) | 55 (Established) |
| Maintenance      | 13/25         | 13/25         |
| Adoption         | 10/25         | 10/25         |
| Maturity         | 16/25         | 16/25         |
| Community        | 18/25         | 16/25         |
| Stars            | 1,197         | 1,913         |
| Forks            | 107           | 95            |
| Commits (30d)    | 3             | 3             |
| Language         | TypeScript    | Vim Script    |
| License          | MIT           | MIT           |
| Package          | No package, no dependents | No package, no dependents |

About llama.vscode

ggml-org/llama.vscode

VS Code extension for LLM-assisted code/text completion

About llama.vim

ggml-org/llama.vim

Vim plugin for LLM-assisted code/text completion

Supports both fill-in-the-middle (FIM) completion with real-time auto-suggestions and instruction-based editing for refactoring tasks. Communicates with a local llama.cpp server instance, enabling efficient inference on consumer hardware through smart context reuse across multiple file buffers. Includes configurable keybindings, performance metrics, and compatibility with FIM-specialized models such as Qwen2.5-Coder across various parameter sizes.
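To make the setup concrete, the local-server workflow looks roughly like the sketch below: start llama.cpp's `llama-server` with a FIM-capable model, then point the editor plugin at it. The model filename, port, and tuning flags shown here are illustrative assumptions; check each plugin's README for its recommended invocation.

```shell
# Start a local llama.cpp server with a FIM-specialized model.
# The model file, port, and tuning flags are illustrative, not canonical.
llama-server \
    -m qwen2.5-coder-1.5b-q8_0.gguf \
    --port 8012 \
    -ngl 99 \
    --ctx-size 0 \
    --cache-reuse 256
```

The editor plugin then sends the text before and after the cursor (plus context gathered from other open buffers) to this server for infill completion, so suggestions are generated entirely on the local machine.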

Scores updated daily from GitHub, PyPI, and npm data.