haoyangzheng-ai/didi-instruct

[ICLR 2026] Discrete Diffusion Divergence Instruct (DiDi-Instruct)

Quality score: 45 / 100 (Emerging)

Distills discrete diffusion language models into few-step students using integral KL-divergence minimization with grouped reward normalization and intermediate-state matching, achieving up to a 64× speedup while matching or improving on the teacher's perplexity. Targets masked diffusion model acceleration on OpenWebText and downstream tasks, with pre-trained checkpoints available on Hugging Face and integrations with PyTorch-based training pipelines.
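The grouped reward normalization mentioned above can be sketched as follows. This is a minimal illustration assuming standard per-group standardization (shift by the group mean, scale by the group standard deviation); the function and parameter names are hypothetical and not taken from the repository.

```python
import math

def normalize_rewards(rewards: list[float], eps: float = 1e-8) -> list[float]:
    """Standardize rewards within one group of sampled completions.

    Each reward is shifted by the group mean and scaled by the group
    standard deviation, so advantages are comparable across groups.
    """
    n = len(rewards)
    mean = sum(rewards) / n
    var = sum((r - mean) ** 2 for r in rewards) / n
    std = math.sqrt(var)
    return [(r - mean) / (std + eps) for r in rewards]
```

After normalization the group has (approximately) zero mean, so below-average samples receive negative advantages and above-average samples positive ones.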


No package published; no dependents.
- Maintenance: 10 / 25
- Adoption: 10 / 25
- Maturity: 15 / 25
- Community: 10 / 25


| Stars | Forks | Language | License | Last pushed | Commits (30d) |
|-------|-------|----------|---------|-------------|---------------|
| 153   | 10    | Python   | MIT     | Mar 04, 2026 | 0            |

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/haoyangzheng-ai/didi-instruct"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
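The same endpoint can also be queried from Python. The sketch below builds the request URL for an arbitrary repository and fetches the report with the standard library; the URL pattern is taken from the curl example above, but the structure of the JSON response is an assumption and is not shown here.

```python
import json
import urllib.request

# Base URL taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (requires network access).

    The response is assumed to be a JSON object; its exact fields are
    not documented here.
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)
```

For example, `quality_url("diffusion", "haoyangzheng-ai", "didi-instruct")` reproduces the URL used in the curl command.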