kuleshov-group/mdlm

[NeurIPS 2024] Simple and Effective Masked Diffusion Language Model

Score: 42/100 (Emerging)

Combines discrete diffusion with a substitution-based parameterization that reduces to masked language modeling losses, enabling competitive perplexity on LM1B and OpenWebText. Implements multiple samplers including a novel cached variant achieving 3-4× speedup over prior diffusion LMs, plus semi-autoregressive generation at 25-30× faster decoding than comparable models. Supports DiT and Mamba architectures, integrates with Hugging Face Hub, and provides complete training pipelines via PyTorch Lightning with SLURM job scripts.

657 stars. No commits in the last 6 months.

Flags: Stale (6 months), No Package, No Dependents
Maintenance: 2/25
Adoption: 10/25
Maturity: 9/25
Community: 21/25


Stars: 657
Forks: 91
Language: Python
License: Apache-2.0
Last pushed: Sep 29, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/kuleshov-group/mdlm"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
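The same endpoint can be called from a script. The sketch below builds the URL from the pattern in the curl example above and fetches the response with Python's standard library; the JSON field names are not documented here, so the example only decodes and prints the raw payload rather than assuming a schema. The `category` path segment ("diffusion") is taken as-is from the example URL.

```python
# Minimal sketch of calling the quality API from Python, assuming the
# endpoint returns JSON. Response field names are not documented here,
# so we print the decoded payload rather than accessing specific keys.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo endpoint URL (pattern from the curl example)."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload.

    Subject to the anonymous limit of 100 requests/day.
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Print the URL; uncomment the fetch to make a live request.
    print(quality_url("diffusion", "kuleshov-group", "mdlm"))
    # print(json.dumps(fetch_quality("diffusion", "kuleshov-group", "mdlm"), indent=2))
```

The live call is left commented out so the script runs offline; swap in an API key (per the note above) if you need the higher rate limit.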