awesome-knowledge-distillation and Awesome-Knowledge-Distillation

These repositories are **competitors**: both are curated collections of knowledge distillation papers and resources that survey the same field. The first is English-focused and more actively maintained, with a higher star count; the second is annotated in Chinese and covers papers from 2014 to 2021.

| Metric | awesome-knowledge-distillation (dkozlov) | Awesome-Knowledge-Distillation (FLHonker) |
|---|---|---|
| Maintenance | 9/25 | 0/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 16/25 | 8/25 |
| Community | 22/25 | 21/25 |
| Stars | 3,825 | 2,654 |
| Forks | 513 | 335 |
| Downloads | — | — |
| Commits (30d) | 1 | 0 |
| Language | — | — |
| License | Apache-2.0 | None |
| Notes | No package, no dependents | No license, stale 6 months, no package, no dependents |

About awesome-knowledge-distillation

dkozlov/awesome-knowledge-distillation

Awesome Knowledge Distillation

A curated collection of knowledge distillation research spanning foundational ensemble methods through modern techniques like attention transfer, dark knowledge, and data-free distillation. The resource catalogs papers across diverse applications—from model compression and adversarial robustness to sequence-level learning and object detection—covering architectural approaches including FitNets, privileged information transfer, and mutual learning schemes. Organized by methodology and application domain, it serves as a comprehensive reference for techniques to transfer knowledge from large teacher models to efficient student networks across vision, NLP, and speech domains.
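The teacher-to-student transfer these collections catalog goes back to Hinton-style soft-target distillation ("dark knowledge"). As a minimal sketch of that core idea (the function names and toy logits below are illustrative, not taken from either repository), the loss blends a temperature-softened match to the teacher's distribution with ordinary hard-label cross-entropy:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label, T=2.0, alpha=0.5):
    """Blend a soft-target term (KL divergence between teacher and student
    distributions at temperature T, scaled by T^2) with the standard
    cross-entropy against the ground-truth label."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Soft-target term: KL(teacher || student) at temperature T
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_teacher, p_student))
    soft = (T ** 2) * kl
    # Hard-target term: cross-entropy on the true label at T = 1
    hard = -math.log(softmax(student_logits)[true_label])
    return alpha * soft + (1 - alpha) * hard

# Toy example: a student whose logits drift from the teacher's pays both terms
loss = distillation_loss([1.0, 2.0, 0.5], [1.2, 2.5, 0.3], true_label=1)
```

A higher temperature `T` flattens both distributions, exposing the teacher's relative confidences across wrong classes, which is the extra signal distillation exploits over hard labels alone.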

About Awesome-Knowledge-Distillation

FLHonker/Awesome-Knowledge-Distillation

Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.

Scores are updated daily from GitHub, PyPI, and npm data.