wooyeolbaek/attention-map-diffusers

🚀 Cross attention map tools for huggingface/diffusers

Score: 63/100 (Established)

Enables extraction and visualization of attention weights across diffusion model layers during inference, supporting both UNet and DiT-based architectures across Stable Diffusion 2/3/3.5, FLUX, and other Diffusers models. Uses module replacement and forward hooks to capture cross-attention tensors at configurable timesteps and layers, then maps attention weights back to tokenized prompt text for per-token spatial visualization. Integrates directly with HuggingFace Diffusers pipelines through a simple `init_pipeline()` call, requiring minimal code changes to existing inference workflows.
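The module-replacement and forward-hook mechanism described above can be sketched in simplified pure-Python form. Everything here (the `CrossAttention` stand-in, `hook_forward`, the store layout) is illustrative, not this package's actual API; the real implementation wraps diffusers attention processors and computes genuine softmax attention.

```python
# Hypothetical sketch: swap a module's forward() for a wrapper that
# records the attention weights it produces, keyed by layer name.
attn_store = {}  # layer name -> list of captured per-token weights

class CrossAttention:
    """Stand-in for a diffusers cross-attention module."""
    def forward(self, hidden_states, token_ids):
        # Toy "attention": uniform weight per prompt token. Real code
        # would compute softmax(Q @ K.T / sqrt(d)) over text tokens.
        n = len(token_ids)
        return [1.0 / n] * n

def hook_forward(module, name):
    """Module replacement: wrap forward() so every call is captured."""
    original = module.forward
    def wrapped(hidden_states, token_ids):
        weights = original(hidden_states, token_ids)
        attn_store.setdefault(name, []).append(weights)
        return weights
    module.forward = wrapped  # instance attribute shadows the method
    return module

layer = hook_forward(CrossAttention(), "down_blocks.0.attn")
tokens = ["a", "cat", "on", "the", "mat"]
layer.forward(None, tokens)

# Map captured weights back to prompt tokens for visualization:
per_token = dict(zip(tokens, attn_store["down_blocks.0.attn"][0]))
```

In the real pipeline, `init_pipeline()` performs this replacement across the configured layers, and the captured maps are later reshaped to the latent's spatial grid for per-token heatmaps.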

397 stars and 87 monthly downloads. Available on PyPI.

Maintenance: 10/25
Adoption: 14/25
Maturity: 25/25
Community: 14/25


Stars: 397
Forks: 28
Language: Python
License: MIT
Last pushed: Feb 02, 2026
Monthly downloads: 87
Commits (30d): 0
Dependencies: 7

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/wooyeolbaek/attention-map-diffusers"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.