Knowledge Distillation Compression Transformer Models

Two knowledge-distillation compression models are tracked. The highest-rated is microsoft/AdaMix, scoring 30/100 with 138 stars.

Get both projects as JSON:

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=knowledge-distillation-compression&limit=20"

Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000/day.
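The response schema is not documented here; assuming the endpoint returns a JSON object with a `projects` list whose entries carry `model`, `score`, and `tier` fields (a guess based on the table below, not a documented contract), a minimal Python sketch for picking out the highest-rated project might look like:

```python
import json

# Hypothetical payload mirroring what the quality endpoint might return;
# the field names here are an assumption, not documented by the API.
sample = json.loads("""
{
  "projects": [
    {"model": "microsoft/AdaMix", "score": 30, "tier": "Emerging"},
    {"model": "pphuc25/distil-cd", "score": 11, "tier": "Experimental"}
  ]
}
""")

# Pick the entry with the highest quality score.
top = max(sample["projects"], key=lambda p: p["score"])
print(f'{top["model"]}: {top["score"]}/100 ({top["tier"]})')
# → microsoft/AdaMix: 30/100 (Emerging)
```

In practice you would replace the hardcoded `sample` with the body of the curl request above.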

# | Model | Description | Score | Tier
1 | microsoft/AdaMix | This is the implementation of the paper AdaMix: Mixture-of-Adaptations for... | 30 | Emerging
2 | pphuc25/distil-cd | Distillation Contrastive Decoding: Improving LLMs Reasoning with Contrastive... | 11 | Experimental