airaria/TextBrewer

A PyTorch-based knowledge distillation toolkit for natural language processing

Quality score: 48 / 100 (Emerging)

Supports multiple distillation techniques including attention-matrix matching, hidden state MSE, and neuron selectivity transfer, with flexible intermediate layer matching and multi-teacher configurations. Built on a modular framework that enables custom loss functions and distillers without requiring model architecture modifications, compatible with Transformer-based architectures across diverse NLP tasks. Integrates with the Hugging Face Transformers library and supports distributed training via DistributedDataParallel and mixed-precision training with Apex.
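For orientation, here is a minimal sketch of how a distillation run is typically configured with TextBrewer, following the usage pattern documented in the project's README. The teacher/student models, dataloader, and optimizer names are placeholders, and exact constructor arguments may differ between TextBrewer versions.

# Minimal TextBrewer distillation sketch (teacher_model, student_model,
# train_dataloader and optimizer are assumed to be built elsewhere;
# argument details may vary across TextBrewer versions).
from textbrewer import GeneralDistiller, TrainingConfig, DistillationConfig

def simple_adaptor(batch, model_outputs):
    # Map raw model outputs to the fields TextBrewer's losses expect.
    # Requires the model to be created with output_hidden_states=True
    # and output_attentions=True.
    return {"logits": model_outputs.logits,
            "hidden": model_outputs.hidden_states,
            "attention": model_outputs.attentions}

train_config = TrainingConfig(device="cuda")
distill_config = DistillationConfig(
    temperature=4,
    # Example intermediate match: teacher layer 8 to student layer 2
    # with a hidden-state MSE loss (layer indices are illustrative).
    intermediate_matches=[
        {"layer_T": 8, "layer_S": 2, "feature": "hidden",
         "loss": "hidden_mse", "weight": 1}])

distiller = GeneralDistiller(
    train_config=train_config, distill_config=distill_config,
    model_T=teacher_model, model_S=student_model,
    adaptor_T=simple_adaptor, adaptor_S=simple_adaptor)

with distiller:
    distiller.train(optimizer, train_dataloader, num_epochs=3, callback=None)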

1,697 stars. No commits in the last 6 months.

Status: Stale (no commits in 6 months), No Package, No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 22 / 25


Stars: 1,697
Forks: 246
Language: Python
License: Apache-2.0
Last pushed: May 08, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/airaria/TextBrewer"

Open to everyone: 100 requests/day with no API key. A free key raises the limit to 1,000 requests/day.
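The same data can be fetched programmatically. Below is a minimal Python sketch using the requests library against the endpoint shown above; the structure of the returned JSON is not documented here, so the example simply prints the parsed payload for inspection.

# Fetch the quality data for airaria/TextBrewer via the API above.
# The response schema is an assumption; print it to see available fields.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/nlp/airaria/TextBrewer"
response = requests.get(url, timeout=10)
response.raise_for_status()
data = response.json()   # parsed JSON payload
print(data)              # inspect the payload to discover its fields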