huggingface/setfit
Efficient few-shot learning with Sentence Transformers
Combines pretrained Sentence Transformers with lightweight classification heads (scikit-learn or PyTorch-based), eliminating the need for prompts or verbalizers in few-shot scenarios. Integrates with the Hugging Face Hub for model management and supports multilingual classification through any Sentence Transformer checkpoint. Training contrastively fine-tunes the embedding body on a small number of labeled examples, then fits the classification head on the resulting embeddings, reaching accuracy competitive with much larger language models while training orders of magnitude faster.
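The contrastive step described above works by pairing the few labeled sentences with one another: same-label pairs serve as positives, cross-label pairs as negatives, and the embedding body is fine-tuned on that signal before the head is trained. A minimal pure-Python sketch of that pair-generation idea (illustrative only, not the library's internal implementation; the function name is hypothetical):

```python
from itertools import combinations

def generate_contrastive_pairs(texts, labels):
    """Build (text_a, text_b, similarity) triples from a few labeled examples.

    Pairs sharing a label get similarity 1.0 (positive); pairs with
    differing labels get 0.0 (negative). These triples are the kind of
    signal a contrastive loss uses to fine-tune the embedding body.
    """
    pairs = []
    for (t1, y1), (t2, y2) in combinations(zip(texts, labels), 2):
        pairs.append((t1, t2, 1.0 if y1 == y2 else 0.0))
    return pairs

# Four labeled examples yield C(4, 2) = 6 pairs: 2 positives, 4 negatives.
texts = ["great movie", "loved it", "terrible film", "awful plot"]
labels = [1, 1, 0, 0]
pairs = generate_contrastive_pairs(texts, labels)
```

Quadratic pair growth is why so few labels suffice: n examples produce n(n-1)/2 training pairs for the embedding body.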
2,699 stars and 212,684 monthly downloads. Used by 3 other packages. Available on PyPI.
Stars
2,699
Forks
255
Language
Jupyter Notebook
License
Apache-2.0
Category
Last pushed
Dec 11, 2025
Monthly downloads
212,684
Commits (30d)
0
Dependencies
7
Reverse dependents
3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/huggingface/setfit"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
YujiaBao/Distributional-Signatures
"Few-shot Text Classification with Distributional Signatures" ICLR 2020
ELM-Research/ECG-Language-Models
A Training and Evaluation Framework for ECG-Language Models (ELMs)
zhongyuchen/few-shot-text-classification
Few-shot binary text classification with Induction Networks and Word2Vec weights initialization
StefanHeng/ECG-Representation-Learning
Self-supervised pre-training for ECG representation with inspiration from transformers & computer vision
cloudera/CML_AMP_Few-Shot_Text_Classification
Perform topic classification on news articles in several limited-labeled data regimes.