Knowledge Distillation Frameworks

Libraries and implementations for knowledge distillation techniques that compress neural networks by transferring knowledge from teacher to student models. Does NOT include general model compression, pruning, quantization, or dataset distillation approaches.
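For readers new to the technique: the classic recipe (Hinton et al., 2015), which most frameworks below build on, trains the student against the teacher's temperature-softened output distribution using a KL-divergence loss scaled by T². A minimal pure-Python sketch with made-up logits, not taken from any library listed here:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T * T) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Identical logits give zero distillation loss
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.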

This list tracks 32 knowledge distillation frameworks, 2 of which score above 50 (the established tier). The highest-rated is Guang000/Awesome-Dataset-Distillation at 63/100 with 1,909 stars, and 2 of the top 10 are actively maintained.

Get all 32 projects as JSON (set `limit` to at least 32 so nothing is cut off):

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=ml-frameworks&subcategory=knowledge-distillation-frameworks&limit=50"
```

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
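Once fetched, the payload can be filtered client-side. A sketch assuming a hypothetical response shape with a `projects` array carrying `name`, `score`, and `tier` fields; the real API schema may differ:

```python
import json

# Hypothetical response shape -- field names are illustrative, not
# confirmed against the actual API.
payload = json.loads("""
{"projects": [
  {"name": "Guang000/Awesome-Dataset-Distillation", "score": 63, "tier": "Established"},
  {"name": "dkozlov/awesome-knowledge-distillation", "score": 57, "tier": "Established"},
  {"name": "SforAiDl/KD_Lib", "score": 49, "tier": "Emerging"}
]}
""")

# Keep only established-tier projects (score above 50)
established = [p["name"] for p in payload["projects"] if p["score"] > 50]
print(established)
```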

| # | Framework | Description | Score | Tier |
|---|-----------|-------------|-------|------|
| 1 | Guang000/Awesome-Dataset-Distillation | A curated list of awesome papers on dataset distillation and related applications. | 63 | Established |
| 2 | dkozlov/awesome-knowledge-distillation | Awesome Knowledge Distillation | 57 | Established |
| 3 | SforAiDl/KD_Lib | A Pytorch Knowledge Distillation library for benchmarking and extending... | 49 | Emerging |
| 4 | SakurajimaMaiii/ProtoKD | [ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with... | 41 | Emerging |
| 5 | FLHonker/Awesome-Knowledge-Distillation | Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021). | 39 | Emerging |
| 6 | HikariTJU/LD | Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023) | 38 | Emerging |
| 7 | szq0214/FKD | Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework... | 37 | Emerging |
| 8 | yzd-v/FGD | Focal and Global Knowledge Distillation for Detectors (CVPR 2022) | 37 | Emerging |
| 9 | decile-team/distil | DISTIL: Deep dIverSified inTeractIve Learning. An active/inter-active... | 37 | Emerging |
| 10 | megvii-research/mdistiller | The official implementation of [CVPR2022] Decoupled Knowledge Distillation... | 33 | Emerging |
| 11 | LutingWang/awesome-knowledge-distillation-for-object-detection | A curated list of awesome knowledge distillation papers and codes for object... | 31 | Emerging |
| 12 | VITA-Group/SymbolicPCC | [NeurIPS 2022] "Symbolic Distillation for Learned TCP Congestion Control",... | 30 | Emerging |
| 13 | circleLZY/MTKD-CD | Official implementation for "JL1-CD: A New Benchmark for Remote Sensing... | 30 | Emerging |
| 14 | BatsResearch/csp | Learning to compose soft prompts for compositional zero-shot learning. | 29 | Experimental |
| 15 | NVlabs/DIODE | Official PyTorch implementation of Data-free Knowledge Distillation for... | 29 | Experimental |
| 16 | DefangChen/SemCKD | [AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation... | 28 | Experimental |
| 17 | DefangChen/SimKD | [CVPR-2022] Official implementation for "Knowledge Distillation with the... | 28 | Experimental |
| 18 | ZuchniakK/MTKD | Multi-Teacher Knowledge Distillation, code for my PhD dissertation. I used... | 26 | Experimental |
| 19 | Adamdad/KnowledgeFactor | [ECCV2022] Factorizing Knowledge in Neural Networks | 26 | Experimental |
| 20 | wjun0830/Difficulty-Aware-Simulator | Official PyTorch Repository of "Difficulty-Aware Simulator for Open Set... | 24 | Experimental |
| 21 | ViTAE-Transformer/SimDistill | The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal... | 24 | Experimental |
| 22 | twinkle0331/LGTM | [ACL 2023] Code for paper "Tailoring Instructions to Student's Learning... | 24 | Experimental |
| 23 | IPL-sharif/KD_Survey | A Comprehensive Survey on Knowledge Distillation | 23 | Experimental |
| 24 | Smooth-humvee686/onpolicydistillation | Apply on-policy distillation to enhance Qwen3-0.6b's performance on GSM8K... | 23 | Experimental |
| 25 | lyxiang-casia/EKD | Official Implementation of Evidential Knowledge Distillation (ICCV 2025) | 23 | Experimental |
| 26 | juyongjiang/KaSA | [ICLR'25] Code for KaSA, an official implementation of "KaSA:... | 22 | Experimental |
| 27 | ismail31416/LumiNet | The official (TMLR) implementation of LumiNet: Perception-Driven Knowledge... | 22 | Experimental |
| 28 | King-Rafat/STKD_CFMitigation | Mitigating carbon footprint for knowledge distillation based deep learning... | 17 | Experimental |
| 29 | adrianrm99/separating_knowledge | [ICML 2025] Separating Knowledge with Procedural Data | 17 | Experimental |
| 30 | AsafShul/PoDD | Official PyTorch Implementation for the "Distilling Datasets Into Less Than... | 16 | Experimental |
| 31 | mashijie1028/TrustDD | (Pattern Recognition 2025) Towards Trustworthy Dataset Distillation | 14 | Experimental |
| 32 | nphdang/FS-BBT | Black-box Few-shot Knowledge Distillation | 12 | Experimental |