pbloem/former

Simple transformer implementation from scratch in PyTorch. (Archival; the latest version is on Codeberg.)

Overall score: 49 / 100 (Emerging)

Built entirely in PyTorch without external NLP libraries, this implementation covers the complete transformer stack: multi-head attention, positional encoding, and feed-forward layers, with detailed comments explaining each component. The codebase prioritizes educational clarity over performance optimization, making it useful for understanding transformer mechanics rather than for production deployment.
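For a sense of what such a walkthrough covers, here is a minimal sketch of one transformer block in PyTorch. It is not code from pbloem/former: the class name and the ff_mult parameter are illustrative, and the sketch leans on PyTorch's built-in nn.MultiheadAttention for brevity, whereas a from-scratch implementation like this repository builds the attention computation by hand.

```python
import torch
from torch import nn

class TransformerBlock(nn.Module):
    """One encoder-style block: self-attention and a feed-forward network,
    each wrapped in a residual connection followed by layer normalization."""

    def __init__(self, emb: int, heads: int, ff_mult: int = 4):
        super().__init__()
        self.attention = nn.MultiheadAttention(emb, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(emb)
        self.norm2 = nn.LayerNorm(emb)
        self.ff = nn.Sequential(
            nn.Linear(emb, ff_mult * emb),
            nn.ReLU(),
            nn.Linear(ff_mult * emb, emb),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attended, _ = self.attention(x, x, x)  # self-attention: q = k = v = x
        x = self.norm1(attended + x)           # residual connection, then norm
        return self.norm2(self.ff(x) + x)      # feed-forward, residual, norm

# A batch of 2 sequences of length 10 with embedding dimension 128.
x = torch.randn(2, 10, 128)
out = TransformerBlock(emb=128, heads=8)(x)
print(out.shape)  # torch.Size([2, 10, 128])
```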

1,092 stars. No commits in the last 6 months.

Flags: Stale (6m) · No Package · No Dependents
Score breakdown (the four subscores sum to the overall 49 / 100):

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 23 / 25


Stars: 1,092
Forks: 172
Language: Python
License: MIT
Last pushed: Mar 20, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/pbloem/former"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
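The same keyless call can be made from Python. This is a minimal sketch using the requests library against the endpoint shown above; since the response schema is not documented here, the example simply prints the parsed JSON.

```python
import requests

# Keyless access as described above: limited to 100 requests/day.
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/pbloem/former"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()  # surface HTTP errors (e.g. rate limiting) early
data = resp.json()       # response schema is not documented here
print(data)
```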