Awesome-Dataset-Distillation and Awesome-Knowledge-Distillation
These are ecosystem siblings within knowledge compression research: dataset distillation (A) and knowledge distillation (B) are related but distinct techniques. Dataset distillation compresses the training data itself, synthesizing a small dataset that trains models nearly as well as the full one, while knowledge distillation transfers learned representations from a large teacher model to a smaller student model.
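To make the knowledge-distillation side concrete, here is a minimal sketch (not code from either repository) of the classic soft-label objective from Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution via KL divergence. Function names and the toy logits are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients keep a comparable magnitude across temperatures.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's soft predictions
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q))))

# Toy example: a student whose logits track the teacher's incurs a
# small loss; identical logits give a loss of zero.
teacher = [5.0, 1.0, -2.0]
student = [4.5, 1.2, -1.8]
print(distillation_loss(student, teacher))
```

In practice this soft-label term is combined with the ordinary cross-entropy on ground-truth labels; the papers in repository B survey many variants of exactly this recipe.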
Scores and stats

                 Awesome-Dataset-Distillation   Awesome-Knowledge-Distillation
Maintenance      25/25                          0/25
Adoption         10/25                          10/25
Maturity         16/25                          8/25
Community        19/25                          21/25
Stars            1,909                          2,654
Forks            170                            335
Downloads        —                              —
Commits (30d)    62                             0
Language         HTML                           —
License          MIT                            —
Flags            No Package, No Dependents      No Package, No Dependents, No License, Stale 6m
About Awesome-Dataset-Distillation
Guang000/Awesome-Dataset-Distillation
A curated list of awesome papers on dataset distillation and related applications.
About Awesome-Knowledge-Distillation
FLHonker/Awesome-Knowledge-Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014–2021).
Scores updated daily from GitHub, PyPI, and npm data.