rashomon-gh/attention-visualiser

A module to visualise attention-layer activations in transformer-based models from Hugging Face.

Quality score: 22 / 100 (Experimental)

  Maintenance  10 / 25
  Adoption      3 / 25
  Maturity      9 / 25
  Community     0 / 25

Not published as a package; no dependents.


Stars:          3
Forks:
Language:       Python
License:        MIT
Last pushed:    Jan 07, 2026
Commits (30d):  0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/rashomon-gh/attention-visualiser"

Open to everyone: 100 requests/day with no key. Get a free key for 1,000 requests/day.
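
For scripted access, here is a minimal Python sketch that fetches the same endpoint with the third-party requests library (assumed installed). The response schema and the mechanism for supplying an API key are not documented on this page, so the script uses the keyless path and simply pretty-prints whatever JSON comes back.

import json

import requests

# Same endpoint as the curl command above; no key needed at 100 requests/day.
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/rashomon-gh/attention-visualiser")

resp = requests.get(URL, timeout=10)
resp.raise_for_status()  # fail loudly on HTTP errors (e.g. rate limiting)

# The response shape isn't shown on this page, so just pretty-print it as-is.
print(json.dumps(resp.json(), indent=2))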