vector-db-benchmark and Vector-Arena
These are competing benchmarking frameworks for vector databases with overlapping scope. Qdrant's tool is the more established of the two (353 stars vs. 1), while Vector-Arena attempts to differentiate itself through multiprocessing isolation and more granular latency metrics (diverse, sequential, filtered, and bulk search).
About vector-db-benchmark
qdrant/vector-db-benchmark
Framework for benchmarking vector search engines
Supports benchmarking across multiple vector databases (Qdrant, Weaviate, Milvus, etc.) with pluggable engine implementations and configurable scenarios covering connection, indexing, data upload, and query phases. Uses a distributed server-client architecture with Docker-based engine deployment and Python clients, allowing parameter tuning via JSON configurations and wildcard-based test selection. Integrates datasets automatically via a central registry and produces standardized performance metrics across different hardware setups for comparative analysis.
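The pluggable-engine design described above can be sketched as a small abstract interface that each database client implements, with the benchmark driving the same connect/upload/search phases against every engine. This is an illustrative sketch only; the class and method names are assumptions, not vector-db-benchmark's actual API.

```python
from abc import ABC, abstractmethod

class Engine(ABC):
    """Hypothetical pluggable-engine interface (illustrative, not the framework's real API)."""

    @abstractmethod
    def connect(self, host: str) -> None: ...

    @abstractmethod
    def upload(self, vectors: list[list[float]]) -> None: ...

    @abstractmethod
    def search(self, query: list[float], top: int) -> list[int]: ...

class BruteForceEngine(Engine):
    """Toy in-memory engine used to exercise the benchmark phases."""

    def connect(self, host: str) -> None:
        self.data: list[list[float]] = []

    def upload(self, vectors: list[list[float]]) -> None:
        self.data.extend(vectors)

    def search(self, query: list[float], top: int) -> list[int]:
        # Exact nearest neighbors by squared Euclidean distance.
        def dist(i: int) -> float:
            return sum((a - b) ** 2 for a, b in zip(self.data[i], query))
        return sorted(range(len(self.data)), key=dist)[:top]

engine = BruteForceEngine()
engine.connect("localhost")
engine.upload([[0.0, 0.0], [1.0, 1.0], [0.1, 0.1]])
print(engine.search([0.0, 0.0], top=2))  # → [0, 2]
```

A real engine implementation would wrap a database client (e.g. Qdrant's) behind the same interface, so the benchmark's JSON-configured scenarios stay engine-agnostic.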
About Vector-Arena
M4iKZ/Vector-Arena
A comprehensive, multiprocessing-isolated benchmark for evaluating vector database performance and quality. Measures insertion speed, search latency (diverse, sequential, filtered, and bulk), recall accuracy, and memory usage across standard (ChromaDB, LanceDB, Qdrant, FAISS, USearch) and custom engine implementations.
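Recall accuracy, one of the metrics listed above, is conventionally measured as recall@k: the fraction of the true top-k neighbors that the engine actually returned. A minimal sketch (the function name and example data are illustrative, not taken from Vector-Arena):

```python
def recall_at_k(retrieved: list[int], ground_truth: list[int], k: int) -> float:
    """Fraction of the true top-k neighbors present in the retrieved top-k."""
    return len(set(retrieved[:k]) & set(ground_truth[:k])) / k

# Example: 3 of the 4 true nearest neighbors were retrieved.
print(recall_at_k([7, 2, 9, 4], [2, 4, 7, 8], k=4))  # → 0.75
```

Averaging this value over all benchmark queries gives the recall figure that is traded off against search latency when tuning index parameters.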
Scores updated daily from GitHub, PyPI, and npm data.