Wojtekb30/GPT-2-B200-pre-trainier

Code for pre-training a GPT-2 model on eight NVIDIA B200 GPUs (a DGX B200 system), plus a short tutorial on the topic. Uses PyTorch and Hugging Face Transformers. It can pre-train GPT-2 Small on 32 GB of data in around 2.5 hours, and it also handles dataset tokenization.
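
To make the description concrete, here is a minimal sketch of what pre-training GPT-2 Small from scratch with Hugging Face Transformers typically looks like. This is not the repository's actual code: the dataset path, batch size, and output directory are illustrative assumptions, and the tokenized dataset is assumed to already contain an input_ids column.

    # Minimal sketch (assumptions, not the repo's actual code): pre-training
    # GPT-2 Small from scratch with Hugging Face Transformers.
    from datasets import load_from_disk
    from transformers import (
        DataCollatorForLanguageModeling,
        GPT2Config,
        GPT2LMHeadModel,
        GPT2TokenizerFast,
        Trainer,
        TrainingArguments,
    )

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

    # Fresh GPT-2 Small weights (~124M parameters), not the pre-trained checkpoint.
    model = GPT2LMHeadModel(GPT2Config())

    # Hypothetical path to a dataset already tokenized into "input_ids".
    dataset = load_from_disk("./tokenized_data")

    args = TrainingArguments(
        output_dir="./gpt2-pretrain",
        per_device_train_batch_size=32,  # illustrative; tune for GPU memory
        num_train_epochs=1,
        bf16=True,                       # B200 GPUs support bfloat16
        logging_steps=100,
        save_steps=5000,
    )

    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=dataset,
        # mlm=False gives causal (next-token) language modeling labels.
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

Launched with torchrun --nproc_per_node=8 train.py, the Trainer shards this run across all eight GPUs via DistributedDataParallel.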

Quality score: 11 / 100 (Experimental)

No commits in the last 6 months.

Flags: Stale (6 months), No Package, No Dependents
Maintenance: 2 / 25
Adoption: 0 / 25
Maturity: 9 / 25
Community: 0 / 25


Stars:
Forks:
Language: Python
License: MIT
Last pushed: Sep 06, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Wojtekb30/GPT-2-B200-pre-trainier"

Open to everyone: 100 requests per day with no key needed. A free key raises the limit to 1,000 requests per day.
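
The same endpoint can be queried from any HTTP client. Below is a minimal Python sketch using requests; the shape of the JSON payload is not documented above, so the fields should be inspected rather than assumed.

    # Minimal sketch: fetching the quality data in Python instead of curl.
    import requests

    url = (
        "https://pt-edge.onrender.com/api/v1/quality/"
        "transformers/Wojtekb30/GPT-2-B200-pre-trainier"
    )
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()

    data = resp.json()
    print(data)  # inspect the payload; field names are not documented here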