vector-db-benchmark and Vector-Arena

These two projects are competing benchmarking frameworks for vector databases. Qdrant's vector-db-benchmark is far more established (353 stars vs. 1), while Vector-Arena attempts to differentiate itself through multiprocessing isolation and more granular latency metrics (diverse, sequential, filtered, and bulk search).

vector-db-benchmark: score 54 (Established)
Maintenance 10/25, Adoption 10/25, Maturity 9/25, Community 25/25
Stars: 353, Forks: 139, Commits (30d): 0, Language: Python, License: Apache-2.0
No package published, no dependents

Vector-Arena: score 23 (Experimental)
Maintenance 13/25, Adoption 1/25, Maturity 9/25, Community 0/25
Stars: 1, Commits (30d): 0, Language: Python, License: MIT
No package published, no dependents

About vector-db-benchmark

qdrant/vector-db-benchmark

Framework for benchmarking vector search engines

Supports benchmarking across multiple vector databases (Qdrant, Weaviate, Milvus, etc.) with pluggable engine implementations and configurable scenarios covering connection, indexing, data upload, and query phases. Uses a distributed server-client architecture with Docker-based engine deployment and Python clients, allowing parameter tuning via JSON configurations and wildcard-based test selection. Integrates datasets automatically via a central registry and produces standardized performance metrics across different hardware setups for comparative analysis.
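As a rough illustration of the wildcard-based test selection described above, the sketch below uses Python's `fnmatch` to pick engines and datasets by shell-style patterns. The registry entries and `select` helper are hypothetical; the real tool discovers engine configurations and datasets from its JSON files and central registry.

```python
import fnmatch

# Hypothetical registry entries for illustration; the real framework
# loads engine variants and datasets from JSON configurations.
ENGINES = ["qdrant-m-16-ef-128", "weaviate-m-16-ef-128", "milvus-default"]
DATASETS = ["glove-100-angular", "deep-image-96-angular"]

def select(names, pattern):
    """Return registry entries matching a shell-style wildcard pattern."""
    return [n for n in names if fnmatch.fnmatch(n, pattern)]

# e.g. run every Qdrant variant against every GloVe dataset
engines = select(ENGINES, "qdrant-*")
datasets = select(DATASETS, "glove-*")
print(engines, datasets)
```

Wildcard selection like this lets one configuration registry serve many runs: a pattern narrows the cross product of engines and datasets without editing the JSON files themselves.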

About Vector-Arena

M4iKZ/Vector-Arena

A comprehensive, multiprocessing-isolated benchmark for evaluating vector database performance and quality. Measures insertion speed, search latency (diverse, sequential, filtered, and bulk), recall accuracy, and memory usage across standard (ChromaDB, LanceDB, Qdrant, FAISS, USearch) and custom engine implementations.
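The multiprocessing isolation that Vector-Arena advertises can be sketched roughly as follows: each engine's benchmark runs in a fresh child process, so one engine's memory footprint, native-library state, or crash cannot skew another engine's measurements. The `bench_engine` workload and its timing formula are placeholders, not Vector-Arena's actual code.

```python
import multiprocessing as mp

def bench_engine(name, result_queue):
    # Placeholder workload; a real benchmark would insert vectors and
    # time searches against the named engine, then report metrics.
    elapsed = 0.001 * len(name)  # hypothetical timing value
    result_queue.put((name, elapsed))

def run_isolated(engines):
    """Run each engine's benchmark in its own process so memory usage
    and state do not leak between engines."""
    results = {}
    for name in engines:
        q = mp.Queue()
        p = mp.Process(target=bench_engine, args=(name, q))
        p.start()
        p.join()
        engine, elapsed = q.get()
        results[engine] = elapsed
    return results

if __name__ == "__main__":
    print(run_isolated(["faiss", "qdrant", "usearch"]))
```

Process-level isolation also makes per-engine memory measurement meaningful, since each child's resident set contains only one engine's index.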

Scores updated daily from GitHub, PyPI, and npm data.