dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
A curated collection of knowledge distillation research spanning foundational ensemble methods through modern techniques like attention transfer, dark knowledge, and data-free distillation. The resource catalogs papers across diverse applications—from model compression and adversarial robustness to sequence-level learning and object detection—covering architectural approaches including FitNets, privileged information transfer, and mutual learning schemes. Organized by methodology and application domain, it serves as a comprehensive reference for techniques to transfer knowledge from large teacher models to efficient student networks across vision, NLP, and speech domains.
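The teacher-to-student transfer described above is most commonly done with Hinton-style soft targets. A minimal NumPy sketch, assuming the standard formulation (temperature-scaled softmax, KL term against the teacher blended with hard-label cross-entropy); function names and the `T`/`alpha` defaults here are illustrative, not from any specific listed paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend a soft-target KL term with hard-label cross-entropy."""
    p_t = softmax(teacher_logits, T)  # softened teacher distribution
    p_s = softmax(student_logits, T)  # softened student distribution
    # KL(teacher || student), scaled by T^2 so gradient magnitudes stay
    # comparable as the temperature changes (as in Hinton et al., 2015)
    kd = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1) * T**2
    # Standard cross-entropy against the ground-truth labels (T = 1)
    q = softmax(student_logits, 1.0)
    ce = -np.log(q[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * kd + (1 - alpha) * ce))
```

When student and teacher logits agree, the KL term vanishes and only the hard-label loss remains; `alpha` trades off imitation of the teacher against fitting the labels.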
3,825 stars. Actively maintained with 1 commit in the last 30 days.
Stars: 3,825
Forks: 513
Language: —
License: Apache-2.0
Category: —
Last pushed: Dec 25, 2025
Commits (30d): 1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/dkozlov/awesome-knowledge-distillation"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
SforAiDl/KD_Lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of...
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)
yzd-v/FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)