Awesome-Dataset-Distillation and awesome-knowledge-distillation

These are ecosystem siblings within knowledge compression—one curates papers on **dataset distillation** (compressing training data itself) while the other covers **knowledge distillation** (compressing trained models), representing two complementary but distinct subfields of model compression research.
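To make the "compressing training data itself" idea concrete, here is a toy sketch of dataset distillation by gradient matching: a synthetic label is optimized so that the gradient it induces on a probe model matches the gradient of the full real dataset. This is an illustrative example under assumed simplifications (1-D linear regression, one synthetic point with a fixed input, plain gradient descent), not code from either repository.

```python
def grad_w(w, xs, ys):
    # dL/dw for squared error on a 1-D linear model y ≈ w*x,
    # averaged over the dataset (xs, ys).
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

# A small "real" dataset, roughly y = 2x.
real_x = [1.0, 2.0, 3.0, 4.0]
real_y = [2.1, 3.9, 6.2, 8.1]

# One learnable synthetic point: input fixed, label optimized.
syn_x, syn_y = [2.0], [0.0]

w_probe = 0.5                      # probe weight at which gradients are matched
target = grad_w(w_probe, real_x, real_y)

lr = 0.05
for _ in range(200):
    g = grad_w(w_probe, syn_x, syn_y)
    # Match loss is (g - target)^2; for one point, dg/dy = -2*x.
    dmatch_dy = 2 * (g - target) * (-2 * syn_x[0])
    syn_y[0] -= lr * dmatch_dy

# After optimization, training on the single synthetic point gives
# (at w_probe) nearly the same gradient as training on all real data.
```

Real dataset-distillation methods in the list generalize this idea to deep networks, many synthetic examples, and matching across training trajectories rather than a single probe point.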

| Metric | Awesome-Dataset-Distillation | awesome-knowledge-distillation |
| --- | --- | --- |
| Maintenance | 25/25 | 9/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 16/25 | 16/25 |
| Community | 19/25 | 22/25 |
| Stars | 1,909 | 3,825 |
| Forks | 170 | 513 |
| Downloads | n/a | n/a |
| Commits (30d) | 62 | 1 |
| Language | HTML | n/a |
| License | MIT | Apache-2.0 |
| Package | None (no dependents) | None (no dependents) |

About Awesome-Dataset-Distillation

Guang000/Awesome-Dataset-Distillation

A curated list of awesome papers on dataset distillation and related applications.

About awesome-knowledge-distillation

dkozlov/awesome-knowledge-distillation

Awesome Knowledge Distillation

A curated collection of knowledge distillation research spanning foundational ensemble methods through modern techniques like attention transfer, dark knowledge, and data-free distillation. The resource catalogs papers across diverse applications—from model compression and adversarial robustness to sequence-level learning and object detection—covering architectural approaches including FitNets, privileged information transfer, and mutual learning schemes. Organized by methodology and application domain, it serves as a comprehensive reference for techniques to transfer knowledge from large teacher models to efficient student networks across vision, NLP, and speech domains.
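The teacher-to-student transfer described above is most often implemented with Hinton-style soft targets: the student matches the teacher's temperature-softened output distribution (the "dark knowledge") alongside the usual hard-label loss. Below is a minimal, dependency-free sketch of that loss; the function names, the temperature of 4.0, and the mixing weight alpha are illustrative assumptions, not values prescribed by the papers in the list.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T flattens the distribution,
    # exposing the teacher's relative confidence in wrong classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, true_label,
                      temperature=4.0, alpha=0.9):
    # Soft targets: teacher and student distributions at temperature T.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    # KL(teacher || student), scaled by T^2 so gradient magnitudes stay
    # comparable as T varies (the convention from Hinton et al.).
    soft = temperature ** 2 * sum(
        pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s) if pt > 0)
    # Standard cross-entropy on the hard label, at T = 1.
    hard = -math.log(softmax(student_logits)[true_label])
    return alpha * soft + (1 - alpha) * hard
```

Variants cataloged in the list (FitNets, attention transfer, mutual learning) replace or augment the soft-target term with losses on intermediate features rather than output distributions.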

Scores updated daily from GitHub, PyPI, and npm data.