Dao-AILab/flash-attention

Fast and memory-efficient exact attention

86 / 100 (Verified)

23,131 stars. Used by 18 other packages. Actively maintained with 103 commits in the last 30 days. Available on PyPI.

Maintenance 25 / 25
Adoption 15 / 25
Maturity 25 / 25
Community 21 / 25

How are scores calculated?
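One pattern is visible from the numbers alone: the four category scores (each out of 25) sum to the overall score. A minimal check in Python, using the values shown above:

```python
# Category scores as displayed on this scorecard (each out of 25).
subscores = {"Maintenance": 25, "Adoption": 15, "Maturity": 25, "Community": 21}

# Their sum matches the overall 86/100 score.
total = sum(subscores.values())
print(total)  # → 86
```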

Stars: 23,131
Forks: 2,583
Language: Python
License: BSD-3-Clause
Last pushed: Apr 04, 2026
Commits (30d): 103
Dependencies: 2
Reverse dependents: 18

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Dao-AILab/flash-attention"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
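For programmatic use, the endpoint's JSON response can be parsed with the standard library alone. The field names below are assumptions inferred from the values shown on this page, not a documented schema; check the actual API response before relying on them:

```python
import json

# Hypothetical response shape, mirroring the fields displayed on this page.
# The real schema returned by pt-edge.onrender.com may differ.
sample = json.loads("""
{
  "score": 86,
  "verified": true,
  "breakdown": {"maintenance": 25, "adoption": 15, "maturity": 25, "community": 21},
  "stars": 23131,
  "forks": 2583,
  "license": "BSD-3-Clause",
  "commits_30d": 103,
  "reverse_dependents": 18
}
""")

# Sanity check: the category subscores add up to the overall score.
assert sum(sample["breakdown"].values()) == sample["score"]
print(f"{sample['score']}/100, {sample['stars']} stars")
```

In practice you would replace the inline sample with the body of the `curl` request above (e.g. via `urllib.request.urlopen`), keeping the same parsing logic.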