qdrant-mcp-server and mcp-server-qdrant

These are ecosystem siblings: the official Qdrant implementation (B) provides the canonical MCP server interface for Qdrant vector databases, while the community implementation (A) builds a specialized variant that bundles multiple embedding providers and code-aware indexing for a more opinionated semantic search workflow.

                  qdrant-mcp-server     mcp-server-qdrant
Score             59 (Established)      56 (Established)
Maintenance       10/25                 13/25
Adoption          13/25                 10/25
Maturity          18/25                 9/25
Community         18/25                 24/25
Stars             21                    1,270
Forks             13                    239
Downloads         790                   (not listed)
Commits (30d)     0                     1
Language          TypeScript            Python
License           MIT                   Apache-2.0
Risk flags        None                  No Package, No Dependents

About qdrant-mcp-server

mhalder/qdrant-mcp-server

MCP server for semantic search using local Qdrant vector database and OpenAI embeddings

Supports multiple embedding providers (Ollama, OpenAI, Cohere, Voyage AI) with privacy-first local processing, plus specialized tools for code vectorization with AST-aware chunking and semantic search over git history. Implements hybrid search combining semantic and keyword matching, contextual correlation across code and git repositories, and incremental indexing for efficient updates. Runs over both stdio and HTTP transports for Claude integration, with configurable custom prompts and support for secured Qdrant instances via API keys.
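To make the "hybrid search" idea above concrete, here is a toy sketch of blending a semantic (vector-similarity) score with a keyword-overlap score. This illustrates the general technique only; it is not qdrant-mcp-server's actual implementation, and the function names, weighting scheme, and sample data are invented for the example.

```python
# Toy hybrid search: combine a semantic score (cosine similarity over
# embeddings) with a lexical score (query/document keyword overlap).
# Not the project's real code; names and weights are illustrative.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, text):
    q = set(query.lower().split())
    t = set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def hybrid_search(query, query_vec, docs, alpha=0.7):
    # docs: list of (text, embedding); alpha weights semantic vs keyword.
    scored = [
        (alpha * cosine(query_vec, vec)
         + (1 - alpha) * keyword_score(query, text), text)
        for text, vec in docs
    ]
    return [text for _, text in sorted(scored, reverse=True)]

docs = [
    ("rust error handling with Result", [0.9, 0.1]),
    ("python list comprehension tips", [0.1, 0.9]),
]
print(hybrid_search("rust error handling", [0.85, 0.2], docs)[0])
# → rust error handling with Result
```

Real implementations typically use learned embeddings and BM25-style lexical scoring rather than raw token overlap, but the fusion step follows the same shape.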

About mcp-server-qdrant

qdrant/mcp-server-qdrant

An official Qdrant Model Context Protocol (MCP) server implementation

Exposes semantic search and storage tools (`qdrant-store` and `qdrant-find`) that let LLMs persist and retrieve contextual information. Built on FastMCP with configurable embedding models, it supports both remote Qdrant instances and local databases, and offers flexible transport protocols (stdio, SSE, HTTP streaming) for integration into Claude Desktop and other MCP clients.
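As a usage sketch, the official server is typically configured through environment variables and launched over stdio. The variable names and `uvx` invocation below reflect the project's README at time of writing; treat them as assumptions and verify against the current documentation before use.

```shell
# Configuration fragment (assumed variable names — check the project README).
export QDRANT_URL="http://localhost:6333"   # or QDRANT_LOCAL_PATH for a local DB
export COLLECTION_NAME="my-collection"
uvx mcp-server-qdrant                        # runs the server over stdio
```

An MCP client such as Claude Desktop would register this command in its server configuration rather than running it by hand.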

Scores updated daily from GitHub, PyPI, and npm data.