tomaarsen/attention_sinks

Extend existing LLMs far beyond their original training length with constant memory usage, without retraining

Score: 53 / 100 (Established)

736 stars and 53 monthly downloads. No commits in the last 6 months. Available on PyPI.

Status: Stale (no commits in the last 6 months)
Maintenance: 0 / 25
Adoption: 14 / 25
Maturity: 25 / 25
Community: 14 / 25
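
The four subscores appear to sum directly to the overall score: 0 + 14 + 25 + 14 = 53 / 100.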


Stars: 736
Forks: 45
Language: Python
License: Apache-2.0
Last pushed: Apr 10, 2024
Monthly downloads: 53
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/tomaarsen/attention_sinks"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
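
A minimal Python sketch for consuming this endpoint, assuming it returns a JSON body. The response schema is not documented here, so the example prints the raw payload rather than assuming any field names:

import json
import urllib.request

# Quality report endpoint for tomaarsen/attention_sinks (from the curl example above).
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/tomaarsen/attention_sinks"

# Fetch and decode the response; assumes the endpoint returns JSON.
with urllib.request.urlopen(URL) as response:
    data = json.loads(response.read().decode("utf-8"))

# Print the full payload; adapt field access once the real schema is confirmed.
print(json.dumps(data, indent=2))

When polling without a key, stay within the free tier of 100 requests/day.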