Knowledge-Distillation Compression for Diffusion Models
Two knowledge-distillation compression projects are tracked in this domain. The highest-rated is amazon-science/crossnorm-selfnorm, scoring 27/100 with 128 GitHub stars.
Get both projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=diffusion&subcategory=knowledge-distillation-compression&limit=20"
```
The API is open to everyone: 100 requests/day without a key, or 1,000 requests/day with a free key.
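The same query can be issued from Python. A minimal sketch using only the standard library; the endpoint and parameters come from the curl example above, while the idea that the response body is JSON-parseable is an assumption:

```python
import json
import urllib.request
from urllib.parse import urlencode

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    # Assemble the query string for the quality endpoint.
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    return f"{BASE}?{urlencode(params)}"

url = build_url("diffusion", "knowledge-distillation-compression")

# Plain GET; uncomment to query the live API (counts against the daily quota):
# with urllib.request.urlopen(url) as resp:
#     projects = json.load(resp)  # assumed: response body is JSON
```

`urlencode` handles any future parameters (and escaping) without hand-building the query string.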
| # | Model | Score | Tier |
|---|---|---|---|
| 1 | amazon-science/crossnorm-selfnorm: CrossNorm and SelfNorm for Generalization under Distribution Shifts, ICCV 2021 | 27 | Experimental |
| 2 | qitianwu/DIFFormer: The official implementation for ICLR23 spotlight paper "DIFFormer: Scalable... | | Experimental |