spec-coding-mcp and context-optimizer-mcp-server
About spec-coding-mcp
kevinlin/spec-coding-mcp
An MCP server that brings an AI spec-driven development workflow to AI-powered IDEs beyond Kiro
Implements a five-stage workflow (goals → requirements → design → tasks → execution) using MCP tools that guide AI-assisted development with EARS-format specifications. Generates structured documentation artifacts in `docs/specs/{feature_name}/` containing requirements, design, and task checklists. Available as both a Claude Code Skill (native integration) and an MCP server compatible with Claude Desktop, Cursor, and other MCP-enabled IDEs.
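As an illustration of the workflow's output, the artifact layout and an EARS-style requirement might look like the sketch below. The feature name, file names, and requirement wording are assumptions for illustration; the server's actual output may differ.

```
docs/specs/user-login/
├── requirements.md   # EARS-format requirements
├── design.md         # technical design
└── tasks.md          # execution checklist

Example EARS-style requirement (illustrative):
  "When the user submits invalid credentials,
   the system shall display an authentication error."
```

EARS (Easy Approach to Requirements Syntax) constrains each requirement to a small set of templates (e.g. "When <trigger>, the <system> shall <response>"), which keeps generated specifications unambiguous and testable.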
About context-optimizer-mcp-server
malaksedarous/context-optimizer-mcp-server
A Model Context Protocol (MCP) server that provides context optimization tools for AI coding assistants, including GitHub Copilot, Cursor AI, Claude Desktop, and other MCP-compatible assistants, enabling them to extract targeted information instead of loading large terminal outputs and files into their context.
Provides LLM-powered tools for intelligent information extraction—file analysis, terminal command execution with output filtering, and web research via Exa.ai—all configurable through environment variables with multi-LLM support (Gemini, Claude, OpenAI). Implements stdio transport with session management for follow-up context on previous executions, plus security controls including path validation and command filtering to prevent context injection.
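The core idea of filtered terminal execution can be sketched generically in Python. This is not the server's actual implementation; the function name and behavior are illustrative only, showing how forwarding matched lines rather than the full output preserves context budget.

```python
import subprocess

def run_filtered(cmd: list[str], keyword: str, max_lines: int = 20) -> str:
    """Run a command and return only lines containing the keyword.

    Illustrative sketch: forwards a small, targeted extract of the
    output rather than the entire log, the way a context-optimizing
    tool would hand an assistant only what it asked about.
    """
    result = subprocess.run(cmd, capture_output=True, text=True)
    matches = [ln for ln in result.stdout.splitlines() if keyword in ln]
    return "\n".join(matches[:max_lines])
```

For example, filtering a 500-line build log for `"error"` might return two or three lines, so the assistant spends context only on the failures it needs to diagnose.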