context-optimizer-mcp-server and copilot-mcp-tool
These are complementary tools: context-optimizer-mcp-server prepares and filters code context for AI assistants, while copilot-mcp-tool extends analysis capacity by delegating to GitHub Copilot's larger token window. Used together, they can optimize what gets sent to Copilot for processing.
About context-optimizer-mcp-server
malaksedarous/context-optimizer-mcp-server
A Model Context Protocol (MCP) server that provides context-optimization tools for AI coding assistants, including GitHub Copilot, Cursor AI, Claude Desktop, and other MCP-compatible assistants, enabling them to extract targeted information rather than wasting their context window on large terminal outputs and files.
Provides LLM-powered tools for intelligent information extraction—file analysis, terminal command execution with output filtering, and web research via Exa.ai—all configurable through environment variables with multi-LLM support (Gemini, Claude, OpenAI). Implements stdio transport with session management for follow-up context on previous executions, plus security controls including path validation and command filtering to prevent context injection.
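As a sketch of how such a stdio-transport server is typically registered with an MCP-compatible client: the shape below follows the common Claude Desktop-style `mcpServers` config, but the `command`/`args` invocation and the environment variable names (`LLM_PROVIDER`, `GEMINI_API_KEY`, `EXA_API_KEY`) are illustrative assumptions, not this server's documented settings.

```json
{
  "mcpServers": {
    "context-optimizer": {
      "command": "npx",
      "args": ["-y", "context-optimizer-mcp-server"],
      "env": {
        "LLM_PROVIDER": "gemini",
        "GEMINI_API_KEY": "<your-key>",
        "EXA_API_KEY": "<your-key>"
      }
    }
  }
}
```

The `env` block is where the multi-LLM choice (Gemini, Claude, or OpenAI) and the Exa.ai web-research key described above would be supplied; consult the repository's README for the actual variable names.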
About copilot-mcp-tool
Poorgramer-Zack/copilot-mcp-tool
An MCP server that enables AI assistants to interact with the GitHub Copilot CLI, leveraging Copilot's large token window for large-file analysis and codebase understanding.
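Because the two servers are complementary, as noted above, both could be registered side by side in the same MCP client configuration. This is a hypothetical sketch: the `npx` invocations and package names mirror the repository names but are assumptions, not documented install commands.

```json
{
  "mcpServers": {
    "context-optimizer": {
      "command": "npx",
      "args": ["-y", "context-optimizer-mcp-server"]
    },
    "copilot": {
      "command": "npx",
      "args": ["-y", "copilot-mcp-tool"]
    }
  }
}
```

In this arrangement the assistant could use context-optimizer's extraction tools to trim files and terminal output first, then hand the distilled context to copilot-mcp-tool for large-scale analysis.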