kuleshov-group/mdlm
[NeurIPS 2024] Simple and Effective Masked Diffusion Language Model
Combines discrete diffusion with a substitution-based parameterization that reduces to masked language modeling losses, enabling competitive perplexity on LM1B and OpenWebText. Implements multiple samplers including a novel cached variant achieving 3-4× speedup over prior diffusion LMs, plus semi-autoregressive generation at 25-30× faster decoding than comparable models. Supports DiT and Mamba architectures, integrates with Hugging Face Hub, and provides complete training pipelines via PyTorch Lightning with SLURM job scripts.
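The claim that the substitution-based parameterization "reduces to masked language modeling losses" can be illustrated with a short sketch: mask each token independently with probability 1 − α_t, compute cross-entropy only at masked positions, and weight by |α′_t|/(1 − α_t). This is an illustrative toy with a linear schedule α_t = 1 − t, not the repo's actual code; all names and shapes here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mdlm_style_loss(logits, targets, t, rng=rng):
    # logits: (B, L, V) predictions on the partially masked sequence
    # targets: (B, L) original token ids; t: (B,) diffusion times in (0, 1)
    B, L, V = logits.shape
    alpha = 1.0 - t                                      # linear schedule: alpha_t = 1 - t
    mask = rng.random((B, L)) < (1.0 - alpha)[:, None]   # P(token masked) = 1 - alpha_t
    # per-token cross-entropy via a numerically stable log-softmax
    z = logits - logits.max(-1, keepdims=True)
    logp = z - np.log(np.exp(z).sum(-1, keepdims=True))
    ce = -np.take_along_axis(logp, targets[..., None], axis=-1)[..., 0]
    # NELBO weight |alpha'_t| / (1 - alpha_t); equals 1/t for the linear schedule
    weight = (1.0 / (1.0 - alpha))[:, None]
    m = mask.astype(float)
    # average the weighted loss over masked positions only
    return (weight * ce * m).sum() / max(m.sum(), 1.0)
```

With uniform logits every masked token contributes log V, so the loss is just the schedule weight times log V, which makes the "weighted masked-LM" structure easy to see.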
657 stars. No commits in the last 6 months.
Stars: 657
Forks: 91
Language: Python
License: Apache-2.0
Category: diffusion
Last pushed: Sep 29, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/kuleshov-group/mdlm"
Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000/day.
Higher-rated alternatives
FlorianFuerrutter/genQC
Generative Quantum Circuits
horseee/DeepCache
[CVPR 2024] DeepCache: Accelerating Diffusion Models for Free
Gen-Verse/MMaDA
MMaDA - Open-Sourced Multimodal Large Diffusion Language Models (dLLMs with block diffusion,...
Shark-NLP/DiffuSeq
[ICLR'23] DiffuSeq: Sequence to Sequence Text Generation with Diffusion Models
jeongwhanchoi/SCONE
"SCONE: A Novel Stochastic Sampling to Generate Contrastive Views and Hard Negative Samples for...