dkozlov/awesome-knowledge-distillation

Awesome Knowledge Distillation

Score: 57 / 100 (Established)

A curated collection of knowledge distillation research spanning foundational ensemble methods through modern techniques like attention transfer, dark knowledge, and data-free distillation. The resource catalogs papers across diverse applications—from model compression and adversarial robustness to sequence-level learning and object detection—covering architectural approaches including FitNets, privileged information transfer, and mutual learning schemes. Organized by methodology and application domain, it serves as a comprehensive reference for techniques to transfer knowledge from large teacher models to efficient student networks across vision, NLP, and speech domains.
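
Most of the cataloged methods build on the classic soft-target recipe from Hinton et al.'s "dark knowledge" work: the student is trained to match the teacher's temperature-softened output distribution alongside the usual hard-label loss. A minimal PyTorch sketch of that loss, assuming classification logits; the function name, temperature, and weighting below are illustrative defaults, not values taken from any listed paper:

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften both distributions with temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_probs = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence on softened outputs; T^2 keeps its gradient
    # magnitude comparable across temperatures (Hinton et al., 2015).
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Toy usage with random logits for an 8-example, 10-class batch.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)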

3,825 stars. Still maintained, with 1 commit in the last 30 days.

No package · No dependents
Maintenance: 9 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 22 / 25


Stars: 3,825
Forks: 513
Language: (not listed)
License: Apache-2.0
Last pushed: Dec 25, 2025
Commits (30d): 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/dkozlov/awesome-knowledge-distillation"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
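
For programmatic access, the same endpoint can be called from Python. A minimal sketch using only the standard library; it assumes the endpoint returns JSON, and since the response schema is not documented on this page, the output is printed for inspection rather than parsed into named fields:

import json
import urllib.request

URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/dkozlov/awesome-knowledge-distillation")

# Fetch the score data; a single call stays well within the
# 100 requests/day anonymous limit.
with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

# Pretty-print whatever fields the API returns.
print(json.dumps(data, indent=2))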