awesome-knowledge-distillation and Awesome-Knowledge-Distillation
These are **competitors**: both are curated collections of knowledge distillation papers and resources that survey the same field. The first is English-language and more actively maintained (higher star count); the second carries Chinese annotations and covers a fixed 2014-2021 window.
About awesome-knowledge-distillation
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
A curated collection of knowledge distillation research spanning foundational ensemble methods through modern techniques like attention transfer, dark knowledge, and data-free distillation. The resource catalogs papers across diverse applications—from model compression and adversarial robustness to sequence-level learning and object detection—covering architectural approaches including FitNets, privileged information transfer, and mutual learning schemes. Organized by methodology and application domain, it serves as a comprehensive reference for techniques to transfer knowledge from large teacher models to efficient student networks across vision, NLP, and speech domains.
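The "dark knowledge" idea the collection catalogs, training a student to match a teacher's temperature-softened output distribution, can be sketched in a few lines. This is a minimal illustration, not code from either repository; the function names and example logits are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's relative confidence across wrong classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients stay comparable across temperatures
    # (as in the classic soft-target formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# Hypothetical teacher/student logits for a 3-class problem.
teacher = [8.0, 2.0, -1.0]
student = [5.0, 3.0, 0.0]
print(distillation_loss(teacher, student))
```

In practice this soft-target term is combined with the ordinary cross-entropy loss on the true labels, weighted by a mixing coefficient.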
About Awesome-Knowledge-Distillation
FLHonker/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.