spec-coding-mcp and context-optimizer-mcp-server

spec-coding-mcp scores 54 overall (Established); context-optimizer-mcp-server scores 46.

| Metric | spec-coding-mcp | context-optimizer-mcp-server |
| --- | --- | --- |
| Overall | 54 | 46 |
| Maintenance | 10/25 | 2/25 |
| Adoption | 11/25 | 8/25 |
| Maturity | 16/25 | 24/25 |
| Community | 17/25 | 12/25 |
| Stars | 19 | 53 |
| Forks | 8 | 6 |
| Downloads | 99 | |
| Commits (30d) | 0 | 0 |
| Language | TypeScript | TypeScript |
| License | None | MIT |
| Flags | No License | Stale 6m |

About spec-coding-mcp

kevinlin/spec-coding-mcp

An MCP server that brings a spec-driven AI development workflow to any AI-powered IDE, not just Kiro

Implements a five-stage workflow (goals → requirements → design → tasks → execution) using MCP tools that guide AI-assisted development with EARS-format specifications. Generates structured documentation artifacts in `docs/specs/{feature_name}/` containing requirements, design, and task checklists. Available as both a Claude Code Skill (native integration) and MCP server compatible with Claude Desktop, Cursor, and other MCP-enabled IDEs.
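For MCP-enabled clients that register servers through a JSON config (such as Claude Desktop or Cursor), wiring up the server might look like the sketch below. This is an assumption for illustration: the npm package name and the `npx` invocation are not taken from the project's documentation.

```json
{
  "mcpServers": {
    "spec-coding": {
      "command": "npx",
      "args": ["-y", "spec-coding-mcp"]
    }
  }
}
```

The EARS (Easy Approach to Requirements Syntax) requirements it generates follow patterns of the form "WHEN \<trigger\>, the \<system\> SHALL \<response\>", which keeps each requirement testable by a single observable behavior.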

About context-optimizer-mcp-server

malaksedarous/context-optimizer-mcp-server

A Model Context Protocol (MCP) server that provides context optimization tools for AI coding assistants, including GitHub Copilot, Cursor AI, Claude Desktop, and other MCP-compatible assistants. It enables them to extract targeted information instead of loading large terminal outputs and files into their context.

Provides LLM-powered tools for intelligent information extraction—file analysis, terminal command execution with output filtering, and web research via Exa.ai—all configurable through environment variables with multi-LLM support (Gemini, Claude, OpenAI). Implements stdio transport with session management for follow-up context on previous executions, plus security controls including path validation and command filtering to prevent context injection.
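Since configuration is driven by environment variables, an MCP client registration might pass the provider selection and API keys through an `env` block, as in the hedged sketch below. The package name and every variable name here (`LLM_PROVIDER`, `GEMINI_API_KEY`, `EXA_API_KEY`) are hypothetical placeholders, not names confirmed by the project.

```json
{
  "mcpServers": {
    "context-optimizer": {
      "command": "npx",
      "args": ["-y", "context-optimizer-mcp-server"],
      "env": {
        "LLM_PROVIDER": "gemini",
        "GEMINI_API_KEY": "<your-gemini-key>",
        "EXA_API_KEY": "<your-exa-key>"
      }
    }
  }
}
```

Keeping keys in the client config's `env` block (rather than in the repository) matches the stdio-transport model, where the client spawns the server process and controls its environment.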

Scores are updated daily from GitHub, PyPI, and npm data.