mcp-language-server and vscode-mcp

These tools are complements that serve different layers of the same integration stack: mcp-language-server exposes LSP capabilities as MCP tools for any MCP client, while vscode-mcp implements an MCP server specifically for VSCode-based editors, exposing their native editor APIs to AI agents. The two can be used together in the same workflow.

                 mcp-language-server     vscode-mcp
Score            54 (Established)        54 (Established)
Maintenance      10/25                   13/25
Adoption         10/25                   8/25
Maturity         16/25                   15/25
Community        18/25                   18/25
Stars            1,478                   58
Forks            119                     14
Commits (30d)    0                       0
Language         Go                      TypeScript
License          BSD-3-Clause            —
Package          none (no dependents)    none (no dependents)

About mcp-language-server

isaacphi/mcp-language-server

mcp-language-server gives MCP-enabled clients access to semantic tools such as get definition, find references, rename, and diagnostics.

Built on the Model Context Protocol, it wraps any stdio-based language server (gopls, rust-analyzer, pyright, clangd, etc.) to expose LSP capabilities as MCP tools for LLM clients. Beyond standard definition and references, it includes file editing via line-based text operations and hover documentation retrieval. Supports multi-language codebases through pluggable language server backends with configurable workspace paths and environment variables.
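As a rough illustration of how such a wrapper is wired into an MCP client, the sketch below shows a typical `mcpServers` config entry that launches mcp-language-server over stdio against a Go workspace. The flag names (`--workspace`, `--lsp`), the workspace path, and the `GOFLAGS` environment variable are assumptions for illustration; check the project's README for the exact CLI surface.

```json
{
  "mcpServers": {
    "language-server": {
      "command": "mcp-language-server",
      "args": ["--workspace", "/path/to/your/project", "--lsp", "gopls"],
      "env": {
        "GOFLAGS": "-mod=vendor"
      }
    }
  }
}
```

Swapping `gopls` for `rust-analyzer`, `pyright-langserver`, or `clangd` is what the "pluggable language server backends" claim amounts to: the wrapper speaks LSP to whatever stdio server it is pointed at and re-exposes the results as MCP tools.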

About vscode-mcp

tjx666/vscode-mcp

An MCP server for Claude Code, VSCode, Cursor, and Windsurf that exposes the editor's own functionality: AI coding agents get real-time LSP diagnostics, type information, and code navigation without waiting for slow tsc or eslint runs.

Exposes eight MCP tools including symbol renaming with automatic import updates, comprehensive LSP information aggregation (definition, hover, signatures), and safe file operations—all backed by VSCode's language servers rather than subprocess execution. Operates as a monorepo with a VSCode extension bridge that maintains per-workspace socket connections to route MCP protocol requests into real-time LSP data, eliminating delays from external type-checking and linting commands. Supports Cursor, Claude Code, Windsurf, and Gemini CLI with granular tool filtering via environment variables or command-line arguments.
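A client-side config for this style of server might look like the sketch below, with tool filtering expressed through an environment variable as the description suggests. The package name `vscode-mcp-server` and the `DISABLE_TOOLS` variable are illustrative assumptions, not confirmed identifiers; consult the repository for the actual package name and filtering syntax.

```json
{
  "mcpServers": {
    "vscode-mcp": {
      "command": "npx",
      "args": ["-y", "vscode-mcp-server"],
      "env": {
        "DISABLE_TOOLS": "rename_symbol,execute_command"
      }
    }
  }
}
```

The MCP server process itself holds no language intelligence here; it relays each tool call over the per-workspace socket to the VSCode extension bridge, which answers from the editor's already-running language servers.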

Scores updated daily from GitHub, PyPI, and npm data.