qdrant-mcp-server and vector-mcp
Both projects are MCP servers with overlapping vector database integration capabilities; vector-mcp supports a broader range of backends (ChromaDB, Couchbase, MongoDB, Qdrant, PGVector), while qdrant-mcp-server is a Qdrant-specific implementation.
About qdrant-mcp-server
mhalder/qdrant-mcp-server
MCP server for semantic search using local Qdrant vector database and OpenAI embeddings
Supports multiple embedding providers (Ollama, OpenAI, Cohere, Voyage AI) with privacy-first local processing, plus specialized tools for code vectorization with AST-aware chunking and semantic search over git history. Implements hybrid search combining semantic and keyword matching, contextual correlation across code and git repositories, and incremental indexing for efficient updates. Runs over both stdio and HTTP transports for Claude integration, with configurable custom prompts and support for secured Qdrant instances via API keys.
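The hybrid search described above fuses a dense-vector similarity score with a keyword match score. A minimal, generic sketch of that idea is below; the cosine-similarity stand-in, the lexical-overlap scorer, the `alpha` weighting, and all function names are illustrative assumptions, not the server's actual implementation.

```python
import math

def cosine(a, b):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, text):
    # Fraction of query terms that appear in the document (simple lexical overlap).
    terms = query.lower().split()
    hits = sum(1 for t in terms if t in text.lower())
    return hits / len(terms) if terms else 0.0

def hybrid_search(query, query_vec, docs, alpha=0.7):
    # Each doc is a (text, embedding) pair. Blend the semantic and keyword
    # scores with a weighted sum; alpha weights the semantic component.
    scored = [
        (alpha * cosine(query_vec, vec) + (1 - alpha) * keyword_score(query, text),
         text)
        for text, vec in docs
    ]
    return [text for score, text in sorted(scored, reverse=True)]
```

Real implementations typically replace the lexical overlap with BM25 and run both stages inside the vector database, but the weighted-fusion step is the same shape.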
About vector-mcp
Knuckles-Team/vector-mcp
Vector MCP Server for AI Agents - Supports ChromaDB, Couchbase, MongoDB, Qdrant, and PGVector
Implements standardized MCP tool interfaces for hybrid lexical-vector search, collection lifecycle management, and RAG workflows across heterogeneous vector databases through a stdio-based transport layer. Built on FastAPI with pluggable database adapters, enabling agents to perform semantic retrieval and document ingestion without database-specific logic. Integrates with Claude and other AI frameworks via the Model Context Protocol, and supports multiple authentication schemes (JWT, OAuth2, OIDC) for production deployment.
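The pluggable-adapter design described above, where one tool surface fronts several backends, can be sketched as an abstract interface plus a registry. The class names, methods, and the toy in-memory backend below are illustrative assumptions, not vector-mcp's actual code.

```python
from abc import ABC, abstractmethod

class VectorAdapter(ABC):
    # Backend-agnostic interface: the MCP tool layer calls only these
    # methods, so agents never see database-specific logic.
    @abstractmethod
    def ingest(self, collection: str, docs: list[str]) -> int: ...
    @abstractmethod
    def search(self, collection: str, query: str, top_k: int = 5) -> list[str]: ...

ADAPTERS: dict[str, type[VectorAdapter]] = {}

def register(name: str):
    # Decorator that records an adapter class under a backend name.
    def wrap(cls):
        ADAPTERS[name] = cls
        return cls
    return wrap

@register("memory")
class InMemoryAdapter(VectorAdapter):
    # Toy stand-in for a real backend (Qdrant, ChromaDB, PGVector, ...).
    def __init__(self):
        self.store: dict[str, list[str]] = {}

    def ingest(self, collection, docs):
        self.store.setdefault(collection, []).extend(docs)
        return len(docs)

    def search(self, collection, query, top_k=5):
        # Naive substring match in place of a real vector query.
        return [d for d in self.store.get(collection, []) if query in d][:top_k]

def get_adapter(backend: str) -> VectorAdapter:
    # A server like this would pick the backend from configuration.
    return ADAPTERS[backend]()
```

An agent-facing tool can then call `get_adapter(backend).search(...)` identically regardless of which database is configured, which is the property that lets one MCP server span five backends.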