vraun0/Transformer

An implementation of the paper Attention Is All You Need (2017) in PyTorch, with the multi-head attention layers and the encoder and decoder blocks implemented from scratch. The model was trained on the IWSLT 2017 dataset for English-to-Italian translation.
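For reference, a minimal PyTorch sketch of the scaled dot-product multi-head attention layer described above. It is illustrative only and not taken from the repository; the class and argument names (MultiHeadAttention, d_model, num_heads) are assumptions, not the repo's actual API.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadAttention(nn.Module):
    """Scaled dot-product multi-head attention (Vaswani et al., 2017)."""

    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # Separate projections for queries, keys, values, plus the output projection.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        batch, q_len, _ = query.shape
        k_len = key.shape[1]

        # Project and split into heads: (batch, heads, seq, d_head).
        q = self.w_q(query).view(batch, q_len, self.num_heads, self.d_head).transpose(1, 2)
        k = self.w_k(key).view(batch, k_len, self.num_heads, self.d_head).transpose(1, 2)
        v = self.w_v(value).view(batch, k_len, self.num_heads, self.d_head).transpose(1, 2)

        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_head)) V
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        out = attn @ v

        # Merge heads back and apply the output projection.
        out = out.transpose(1, 2).contiguous().view(batch, q_len, -1)
        return self.w_o(out)

# Example usage with toy dimensions (self-attention: query = key = value).
if __name__ == "__main__":
    mha = MultiHeadAttention(d_model=512, num_heads=8)
    x = torch.randn(2, 10, 512)   # (batch, seq_len, d_model)
    print(mha(x, x, x).shape)     # torch.Size([2, 10, 512])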

Score: 11 / 100 (Experimental)

No commits in the last 6 months.

No license · Stale (6 months) · No package · No dependents
Maintenance 2 / 25
Adoption 2 / 25
Maturity 7 / 25
Community 0 / 25


Stars 2
Forks
Language Python
License none
Last pushed Aug 12, 2025
Commits (30d) 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/vraun0/Transformer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
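The same data can be fetched from Python (the repository's own language). Below is a small sketch using only the standard library; it assumes the endpoint returns a JSON body, and since the response fields are not documented here, it simply pretty-prints whatever comes back.

import json
import urllib.request

# Same endpoint as the curl example above; no API key is needed for up to
# 100 requests/day (a free key raises the limit to 1,000/day).
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/vraun0/Transformer"

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)          # assumes a JSON response body

print(json.dumps(data, indent=2))   # pretty-print the returned fields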