isaacus-dev/mleb
The code used to evaluate embedding models on the Massive Legal Embedding Benchmark (MLEB).
Overall score: 41 / 100 (Emerging)
Package: not published
Dependents: none
Maintenance: 10 / 25
Adoption: 7 / 25
Maturity: 13 / 25
Community: 11 / 25
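The overall score appears to be the sum of the four 25-point components: 10 + 7 + 13 + 11 = 41.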
Stars: 32
Forks: 4
Language: Python
License: MIT
Category: (none listed)
Last pushed: Feb 24, 2026
Commits (30d): 0
Get this data via the API:
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/isaacus-dev/mleb"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
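A minimal Python sketch for pulling the same data programmatically, assuming only the endpoint shown above; the response is printed raw because the JSON schema is not documented on this page.

import json
import urllib.request

# Quality-score endpoint for this repo (taken from the curl example above).
URL = "https://pt-edge.onrender.com/api/v1/quality/embeddings/isaacus-dev/mleb"

# Anonymous access is rate-limited to 100 requests/day per the note above.
with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

# Print the raw payload; inspect it before relying on specific field names.
print(json.dumps(data, indent=2))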
Higher-rated alternatives (score in parentheses):
- embeddings-benchmark/mteb (99): MTEB: Massive Text Embedding Benchmark
- yannvgn/laserembeddings (60): LASER multilingual sentence embeddings as a pip package
- harmonydata/harmony (59): The Harmony Python library: a research tool for psychologists to harmonise data and...
- embeddings-benchmark/results (54): Data for the MTEB leaderboard
- fresh-stack/freshstack (50): This repository helps you evaluate your models on the FreshStack benchmark!