thiagosilvahyper/bihe-quantization
BIHE Protocol - Next-generation vector quantization combining E8 lattice geometry with Lloyd algorithm optimization. Achieves a 16× compression ratio while maintaining 88.5% recall on real-world datasets. Validated on Stanford SQuAD v2.0 (5,000 texts) and Microsoft MS MARCO (6,000 texts). Reports a normalized second moment (NSM) below Shannon theoretical limits.
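The E8 lattice geometry mentioned above admits a fast nearest-point search via the standard decomposition E8 = D8 ∪ (D8 + 1/2). The sketch below illustrates that quantization step; it is a minimal illustration of the lattice technique, not the repository's actual implementation, and all function names are my own.

```python
def _nearest_Dn(x):
    """Nearest point in D_n (integer vectors with even coordinate sum)."""
    f = [round(v) for v in x]
    if sum(f) % 2 != 0:
        # Parity is wrong: flip the coordinate with the largest rounding error
        # to the next-nearest integer, restoring an even coordinate sum.
        i = max(range(len(x)), key=lambda k: abs(x[k] - f[k]))
        f[i] += 1 if x[i] > f[i] else -1
    return f

def nearest_E8(x):
    """Nearest E8 lattice point, using E8 = D8 union (D8 + 1/2)."""
    a = _nearest_Dn(x)                                       # integer coset
    b = [v + 0.5 for v in _nearest_Dn([v - 0.5 for v in x])] # half-integer coset
    da = sum((u - v) ** 2 for u, v in zip(x, a))
    db = sum((u - v) ** 2 for u, v in zip(x, b))
    return a if da <= db else b
```

Quantizing an 8-dimensional block to its nearest E8 point is the step a Lloyd-style refinement would then iterate over, re-estimating scale offsets against the data distribution.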
Stars: 2
Forks: —
Language: Python
License: MIT
Category:
Last pushed: Oct 27, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/thiagosilvahyper/bihe-quantization"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
FlagOpen/FlagEmbedding: Retrieval and Retrieval-augmented LLMs
Blaizzy/mlx-embeddings: MLX-Embeddings is the best package for running Vision and Language Embedding models locally on...
qdrant/fastembed: Fast, Accurate, Lightweight Python library to make State of the Art Embedding
Merck/Sapiens: Sapiens is a human antibody language model based on BERT.
amansrivastava17/embedding-as-service: One-Stop Solution to encode sentence to fixed length vectors from various embedding techniques