SeanLee97/AnglE
Train and Infer Powerful Sentence Embeddings with AnglE | 🔥 SOTA on STS and MTEB Leaderboard
Implements angle-optimized loss functions (AnglE, Espresso, CoSENT, contrastive) for training embeddings across BERT and LLM backbones, including bidirectional LLM variants. Supports both single and multi-GPU training with task-specific prompt engineering for retrieval versus similarity scenarios. Integrates with Hugging Face transformers and provides LoRA-based fine-tuning for efficient adaptation of large language models.
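The angle-optimized and CoSENT objectives mentioned above are ranking losses over pairwise cosine similarities of embedding pairs. As a rough illustration only (this is not the repository's implementation; the function names and the `scale` value are assumptions), a minimal pure-Python sketch of a CoSENT-style loss:

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def cosent_loss(pairs, labels, scale=20.0):
    """CoSENT-style ranking loss (sketch).

    pairs:  list of (vec_a, vec_b) sentence-embedding pairs
    labels: gold similarity score per pair (higher = more similar)

    For every pair index (i, j) with labels[i] > labels[j], the loss
    pushes cos(i) above cos(j): log(1 + sum exp(scale * (cos_j - cos_i))).
    """
    cos = [cosine(a, b) for a, b in pairs]
    terms = [
        math.exp(scale * (cos[j] - cos[i]))
        for i in range(len(pairs))
        for j in range(len(pairs))
        if labels[i] > labels[j]
    ]
    return math.log(1.0 + sum(terms))

# Toy usage: a near-duplicate pair should be ranked above an unrelated one,
# so the loss is close to zero when the ordering is already correct.
pairs = [([1.0, 0.0], [0.9, 0.1]),   # similar "sentences"
         ([1.0, 0.0], [0.0, 1.0])]   # unrelated "sentences"
loss = cosent_loss(pairs, labels=[1, 0])
```

The design point is that only the relative order of gold similarity scores matters, which is why the same objective works for both binary and graded similarity labels.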
Stars: 568
Forks: 38
Language: Python
License: MIT
Category:
Last pushed: Oct 19, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/SeanLee97/AnglE"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
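The endpoint follows an owner/repo path pattern. A minimal sketch that builds the request URL with the standard library (the `quality_url` helper name is ours; only the URL shape comes from the curl example above):

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality/embeddings"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repository quality endpoint URL
    # (path shape taken from the curl example above).
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

url = quality_url("SeanLee97", "AnglE")
# Fetch with e.g. urllib.request.urlopen(url); anonymous access
# is rate-limited to 100 requests/day.
```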
Higher-rated alternatives
embeddings-benchmark/mteb
MTEB: Massive Text Embedding Benchmark
yannvgn/laserembeddings
LASER multilingual sentence embeddings as a pip package
harmonydata/harmony
The Harmony Python library: a research tool for psychologists to harmonise data and...
embeddings-benchmark/results
Data for the MTEB leaderboard
fresh-stack/freshstack
This repository helps you evaluate your models on the FreshStack benchmark!