yoshitomo-matsubara/torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at venues such as TPAMI, CVPR, ICLR, ECCV, NeurIPS, ICCV, and AAAI are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.

Score: 66 / 100 (Established)

1,601 stars and 374 monthly downloads. Available on PyPI.

Maintenance: 6 / 25
Adoption: 16 / 25
Maturity: 25 / 25
Community: 19 / 25


Stars: 1,601
Forks: 143
Language: Python
License: MIT
Last pushed: Dec 24, 2025
Monthly downloads: 374
Commits (30d): 0
Dependencies: 6

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/yoshitomo-matsubara/torchdistill"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
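The JSON returned by the endpoint above can be summarized in a few lines of Python. A minimal sketch, assuming a response shape matching the figures shown on this page (the field names below are guesses, not a documented schema):

```python
import json

# Hypothetical response payload: field names are assumptions based on the
# values displayed on this page, not the API's documented schema.
sample_response = json.loads("""
{
  "score": 66,
  "label": "Established",
  "breakdown": {"maintenance": 6, "adoption": 16, "maturity": 25, "community": 19},
  "stats": {"stars": 1601, "forks": 143, "monthly_downloads": 374}
}
""")

def summarize(payload: dict) -> str:
    """Render a one-line summary from a quality-score payload."""
    parts = ", ".join(f"{k}: {v}/25" for k, v in payload["breakdown"].items())
    return f"{payload['score']}/100 ({payload['label']}) | {parts}"

print(summarize(sample_response))
# → 66/100 (Established) | maintenance: 6/25, adoption: 16/25, maturity: 25/25, community: 19/25
```

To use it against the live endpoint, fetch the URL from the curl command (e.g. with `urllib.request`) and pass the decoded JSON to `summarize`, adjusting the field names to whatever the API actually returns.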