octocode-mcp and src-to-kb
These are **complements**: one provides real-time semantic search across repositories via the MCP protocol, while the other performs static code-to-knowledge-base conversion—together they address both dynamic querying and batch indexing needs for LLM code understanding.
About octocode-mcp
bgauryy/octocode-mcp
MCP server for semantic code research and real-time context generation using LLM patterns | Search naturally across public & private repos based on your permissions | Transform any accessible codebase into AI-optimized knowledge on simple and complex flows | Find real implementations and live docs from anywhere
Implements MCP (Model Context Protocol) with LSP-powered code intelligence (Go to Definition, Find References, Call Hierarchy) across GitHub, GitLab, and local codebases, enabling compiler-level understanding without parsing. Provides modular Agent Skills—including multi-phase research with session persistence, AST-driven code analysis, dependency graphing, and PR review across seven domains—composable via CLI or direct integration into Claude/Cursor.
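MCP tool invocations are carried as JSON-RPC 2.0 messages. As a minimal sketch of what a client request might look like on the wire — the tool name and argument keys below are illustrative placeholders, not octocode-mcp's actual tool schema:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request, the message shape
    MCP clients use to invoke a tool exposed by an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name and arguments, for illustration only.
msg = make_tool_call(1, "search_code", {"query": "parse config", "repo": "owner/name"})
```

A host such as Claude or Cursor sends messages of this shape to the server and renders the tool's structured results back into the model's context.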
About src-to-kb
vezlo/src-to-kb
Convert source code to LLM ready knowledge base
Implements intelligent code chunking with configurable overlap and optional OpenAI embeddings for semantic search, while exposing a REST API with Swagger docs and MCP (Model Context Protocol) server for native Claude/Cursor IDE integration. Supports multiple languages and sources—codebases, Notion databases—with three answer modes (End User/Developer/Copilot) and optional external server offloading via REST for distributed knowledge base processing.
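Chunking with configurable overlap is a standard indexing technique: each chunk repeats the tail of the previous one so that content spanning a boundary is never lost to either side. A minimal sketch — parameter names and defaults here are illustrative, not src-to-kb's actual configuration:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks, where each chunk begins with
    the last `overlap` characters of the previous chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far each chunk's start advances
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

source = "x" * 500
chunks = chunk_text(source, chunk_size=200, overlap=50)
# Consecutive chunks share a 50-character window, so an embedding of
# either chunk still captures text near the boundary.
```

In an embedding pipeline, each chunk would then be sent to a model such as OpenAI's embedding API and stored alongside its vector for semantic retrieval.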