Knowledge Distillation Frameworks
Libraries and implementations for knowledge distillation techniques that compress neural networks by transferring knowledge from teacher to student models. Does NOT include general model compression, pruning, quantization, or dataset distillation approaches.
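The teacher-to-student transfer these frameworks implement usually comes down to the classic distillation objective of Hinton et al. (2015): the student matches the teacher's temperature-softened outputs alongside the ground-truth labels. A minimal NumPy sketch (not taken from any listed framework; the function and parameter names are illustrative):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled, numerically stable softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of a soft-target KL term and a hard-label cross-entropy term.

    alpha weights the distillation (soft) term; the T**2 factor keeps soft-target
    gradients on the same scale as the hard-label gradients (Hinton et al., 2015).
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) on temperature-softened distributions
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)), axis=-1)
    # standard cross-entropy against ground-truth labels (temperature 1)
    q = softmax(student_logits)
    ce = -np.log(q[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

When the student's logits equal the teacher's, the KL term vanishes and only the hard-label term remains, which is why the loss interpolates cleanly between imitation and supervised training.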
There are 32 knowledge distillation frameworks tracked, 2 of which score above 50 (the established tier). The highest-rated is Guang000/Awesome-Dataset-Distillation at 63/100 with 1,909 stars. 2 of the top 10 are actively maintained.
Get all 32 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=ml-frameworks&subcategory=knowledge-distillation-frameworks&limit=32"
```

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
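Once fetched, the JSON can be filtered by tier client-side. The response schema below is an assumption for illustration (a `projects` array with `name`, `score`, and `tier` fields; the sample scores other than 63 are placeholders, not real data):

```python
import json

# Hypothetical payload mimicking what the quality endpoint might return;
# only the 63/100 score for the top entry comes from the page above.
sample = json.loads("""
{
  "projects": [
    {"name": "Guang000/Awesome-Dataset-Distillation", "score": 63, "tier": "Established"},
    {"name": "SforAiDl/KD_Lib", "score": 41, "tier": "Emerging"}
  ]
}
""")

def by_tier(payload, tier):
    """Return project names in the given tier, highest score first."""
    rows = [p for p in payload["projects"] if p["tier"] == tier]
    return [p["name"] for p in sorted(rows, key=lambda p: -p["score"])]

print(by_tier(sample, "Established"))
```

Swapping `sample` for the live response from the `curl` call above gives the same filtering against real data, provided the actual field names match.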
| # | Framework | Description | Score | Tier |
|---|---|---|---|---|
| 1 | Guang000/Awesome-Dataset-Distillation | A curated list of awesome papers on dataset distillation and related applications. | 63 | Established |
| 2 | dkozlov/awesome-knowledge-distillation | Awesome Knowledge Distillation | | Established |
| 3 | SforAiDl/KD_Lib | A PyTorch Knowledge Distillation library for benchmarking and extending... | | Emerging |
| 4 | SakurajimaMaiii/ProtoKD | [ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with... | | Emerging |
| 5 | FLHonker/Awesome-Knowledge-Distillation | Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category. | | Emerging |
| 6 | HikariTJU/LD | Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023) | | Emerging |
| 7 | szq0214/FKD | Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework... | | Emerging |
| 8 | yzd-v/FGD | Focal and Global Knowledge Distillation for Detectors (CVPR 2022) | | Emerging |
| 9 | decile-team/distil | DISTIL: Deep dIverSified inTeractIve Learning. An active/inter-active... | | Emerging |
| 10 | megvii-research/mdistiller | The official implementation of [CVPR2022] Decoupled Knowledge Distillation... | | Emerging |
| 11 | LutingWang/awesome-knowledge-distillation-for-object-detection | A curated list of awesome knowledge distillation papers and codes for object... | | Emerging |
| 12 | VITA-Group/SymbolicPCC | [NeurIPS 2022] "Symbolic Distillation for Learned TCP Congestion Control",... | | Emerging |
| 13 | circleLZY/MTKD-CD | Official implementation for "JL1-CD: A New Benchmark for Remote Sensing... | | Emerging |
| 14 | BatsResearch/csp | Learning to compose soft prompts for compositional zero-shot learning. | | Experimental |
| 15 | NVlabs/DIODE | Official PyTorch implementation of Data-free Knowledge Distillation for... | | Experimental |
| 16 | DefangChen/SemCKD | [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation... | | Experimental |
| 17 | DefangChen/SimKD | [CVPR-2022] Official implementation for "Knowledge Distillation with the... | | Experimental |
| 18 | ZuchniakK/MTKD | Multi-Teacher Knowledge Distillation, code for my PhD dissertation. I used... | | Experimental |
| 19 | Adamdad/KnowledgeFactor | [ECCV2022] Factorizing Knowledge in Neural Networks | | Experimental |
| 20 | wjun0830/Difficulty-Aware-Simulator | Official PyTorch Repository of "Difficulty-Aware Simulator for Open Set... | | Experimental |
| 21 | ViTAE-Transformer/SimDistill | The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal... | | Experimental |
| 22 | twinkle0331/LGTM | [ACL 2023] Code for paper "Tailoring Instructions to Student's Learning... | | Experimental |
| 23 | IPL-sharif/KD_Survey | A Comprehensive Survey on Knowledge Distillation | | Experimental |
| 24 | Smooth-humvee686/onpolicydistillation | Apply on-policy distillation to enhance Qwen3-0.6b's performance on GSM8K... | | Experimental |
| 25 | lyxiang-casia/EKD | Official Implementation of Evidential Knowledge Distillation (ICCV 2025) | | Experimental |
| 26 | juyongjiang/KaSA | [ICLR'25] Code for KaSA, an official implementation of "KaSA:... | | Experimental |
| 27 | ismail31416/LumiNet | The official (TMLR) implementation of LumiNet: Perception-Driven Knowledge... | | Experimental |
| 28 | King-Rafat/STKD_CFMitigation | Mitigating carbon footprint for knowledge distillation based deep learning... | | Experimental |
| 29 | adrianrm99/separating_knowledge | [ICML 2025] Separating Knowledge with Procedural Data | | Experimental |
| 30 | AsafShul/PoDD | Official PyTorch Implementation for the "Distilling Datasets Into Less Than... | | Experimental |
| 31 | mashijie1028/TrustDD | (Pattern Recognition 2025) Towards Trustworthy Dataset Distillation | | Experimental |
| 32 | nphdang/FS-BBT | Black-box Few-shot Knowledge Distillation | | Experimental |