liangyuwang/train-large-model-from-scratch
A minimal, hackable pre-training stack for GPT-style language models
Overall score: 23 / 100 (Experimental)
No Package, No Dependents
Maintenance: 10 / 25
Adoption: 4 / 25
Maturity: 9 / 25
Community: 0 / 25
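The page does not state how the overall score is composed, but the four 25-point components above sum exactly to the 23 / 100 shown. A minimal Python sketch, assuming the overall score is simply that sum:

# Assumed composition rule: overall score = sum of the four 25-point components.
# The numbers are the ones shown on this page.
components = {"Maintenance": 10, "Adoption": 4, "Maturity": 9, "Community": 0}
overall = sum(components.values())  # 10 + 4 + 9 + 0 = 23
print(f"Overall score: {overall} / 100")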
Stars: 7
Forks: —
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 21, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/liangyuwang/train-large-model-from-scratch"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
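The same report can be fetched from Python instead of curl. A minimal sketch using only the standard library; it simply dumps whatever JSON the endpoint returns, since the response schema is not documented on this page:

import json
import urllib.request

# Endpoint URL copied from the curl example above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/llm-tools/"
       "liangyuwang/train-large-model-from-scratch")

# Fetch the quality report and pretty-print the raw JSON payload.
with urllib.request.urlopen(URL, timeout=10) as resp:
    report = json.load(resp)
print(json.dumps(report, indent=2))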
Higher-rated alternatives
Lightning-AI/litgpt (score 81): 20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
SPUTNIKAI/LeechTransformer (score 37): Leech-Lila: A Geometric Attention Transformer (Language Model) with the Leech Lattice Attention
liangyuwang/Tiny-DeepSpeed (score 36): Tiny-DeepSpeed, a minimalistic re-implementation of the DeepSpeed library
microsoft/Text2Grad (score 35): 🚀 Text2Grad: Converting natural language feedback into gradient signals for precise model...
viralcode/superGPT (score 35): Train your own LLM from scratch