qdrant-mcp-server and mcp-server-qdrant
These are ecosystem siblings: the community implementation (A) builds a specialized variant that bundles embedding-provider integrations for a more opinionated semantic search workflow, while the official Qdrant implementation (B) provides the canonical MCP server interface for Qdrant vector databases.
About qdrant-mcp-server
mhalder/qdrant-mcp-server
MCP server for semantic search using local Qdrant vector database and OpenAI embeddings
Supports multiple embedding providers (Ollama, OpenAI, Cohere, Voyage AI) with privacy-first local processing, plus specialized tools for code vectorization with AST-aware chunking and semantic search over git history. Implements hybrid search combining semantic and keyword matching, contextual correlation across code and git repositories, and incremental indexing for efficient updates. Operates over both stdio and HTTP transports for Claude integration, with configurable custom prompts and support for secured Qdrant instances via API keys.
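To illustrate how a server like this is typically registered with an MCP client such as Claude Desktop, here is a minimal sketch of a configuration entry. The launcher command, package name, and environment variable names below are illustrative assumptions, not taken from the project's documentation; consult the repository README for the actual invocation.

```json
{
  "mcpServers": {
    "qdrant-semantic-search": {
      "command": "npx",
      "args": ["-y", "qdrant-mcp-server"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_API_KEY": "<only-needed-for-secured-instances>",
        "OPENAI_API_KEY": "<only-needed-when-using-openai-embeddings>"
      }
    }
  }
}
```

The general shape (a `command`/`args` pair plus an `env` map) is the standard MCP client configuration pattern; the API-key entries correspond to the secured-instance and embedding-provider support described above.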
About mcp-server-qdrant
qdrant/mcp-server-qdrant
An official Qdrant Model Context Protocol (MCP) server implementation
Exposes semantic search and storage tools (`qdrant-store` and `qdrant-find`) that allow LLMs to persist and retrieve contextual information with configurable embedding models via FastMCP. Supports both remote Qdrant instances and local databases, with flexible transport protocols (stdio, SSE, HTTP streaming) for integration into Claude Desktop and other MCP clients.
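As a sketch, a Claude Desktop entry for the official server might look like the following. This assumes the `uvx`-based launcher and environment variables (`QDRANT_URL`, `QDRANT_API_KEY`, `COLLECTION_NAME`, `EMBEDDING_MODEL`) described in the project's README; the URL, key, and collection values are placeholders.

```json
{
  "mcpServers": {
    "qdrant": {
      "command": "uvx",
      "args": ["mcp-server-qdrant"],
      "env": {
        "QDRANT_URL": "https://<your-instance>.cloud.qdrant.io:6333",
        "QDRANT_API_KEY": "<your-api-key>",
        "COLLECTION_NAME": "my-collection"
      }
    }
  }
}
```

Once registered, the client exposes the `qdrant-store` and `qdrant-find` tools to the LLM; for a local database instead of a remote instance, the server's local-path option can replace the URL and API key.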