SforAiDl/KD_Lib
A PyTorch knowledge distillation library for benchmarking and extending work in the domains of Knowledge Distillation, Pruning, and Quantization.
652 stars and 33 monthly downloads. No commits in the last 6 months. Available on PyPI.
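To make the repository's domain concrete without assuming anything about KD_Lib's own API, here is a dependency-free sketch of the core knowledge-distillation loss (Hinton-style soft targets): the student is trained to match the teacher's temperature-softened output distribution. Function names and the example logits are illustrative, not taken from KD_Lib.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in the original soft-target formulation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2
```

The loss is zero when student and teacher logits agree and grows as their softened distributions diverge; in practice it is combined with the ordinary cross-entropy on hard labels.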
Stars: 652
Forks: 61
Language: Python
License: MIT
Last pushed: Mar 01, 2023
Monthly downloads: 33
Commits (30d): 0
Dependencies: 26
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SforAiDl/KD_Lib"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
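A minimal Python sketch of calling this endpoint, assuming only the URL shape shown in the curl example above. The JSON response schema and the `Authorization` header used for the optional API key are assumptions, not documented behavior.

```python
import json
from urllib.request import Request, urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    # Mirrors the documented endpoint shape: /{category}/{owner}/{repo}.
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, api_key=None):
    # Response schema is an assumption; only the path comes from the
    # curl example. The bearer-token header for keyed access is a guess.
    headers = {"Accept": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"  # assumed header name
    req = Request(quality_url(category, owner, repo), headers=headers)
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

For example, `quality_url("ml-frameworks", "SforAiDl", "KD_Lib")` reproduces the URL from the curl command above.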
Related frameworks
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
SakurajimaMaiii/ProtoKD
[ICASSP 2023] Prototype Knowledge Distillation for Medical Segmentation with Missing Modality
HikariTJU/LD
Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)
yzd-v/FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)