mcbieda/llm-from-scratch
1. A simple from-scratch implementation of the GPT-2 LLM that allows easy modification and experimentation (based on "Build a Large Language Model (From Scratch)" by Sebastian Raschka).
2. Domain-adaptive pretraining (DAPT) using biomedical abstracts.
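For readers unfamiliar with the second item: DAPT is simply continued next-token pretraining of an already-pretrained model on in-domain text. Below is a minimal sketch of that idea, using Hugging Face's GPT-2 as a stand-in for this repo's own from-scratch model; the corpus filename, block size, and learning rate are illustrative assumptions, not taken from the repo.

```python
# Minimal DAPT sketch: keep training a general GPT-2 checkpoint with the
# ordinary next-token objective, but on biomedical abstracts only.
# Hugging Face's GPT-2 stands in for the repo's from-scratch model;
# "biomedical_abstracts.txt" is a hypothetical corpus file.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

with open("biomedical_abstracts.txt") as f:
    ids = tokenizer(f.read(), return_tensors="pt").input_ids[0]

block = 512  # tokens per training chunk (assumed context length)
for start in range(0, ids.size(0) - block, block):
    chunk = ids[start:start + block].unsqueeze(0)
    # With labels == input_ids, GPT2LMHeadModel computes the shifted
    # next-token cross-entropy loss internally.
    loss = model(input_ids=chunk, labels=chunk).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The repo's own training code may differ; this only illustrates the DAPT objective of continued pretraining on domain text.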
Stars: —
Forks: —
Language: Jupyter Notebook
License: Apache-2.0
Category: —
Last pushed: Feb 26, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mcbieda/llm-from-scratch"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
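A Python equivalent of the curl call above, for scripted use. Only the URL comes from this page; the keyed tier's header name is not documented here, so it is left as a commented-out assumption.

```python
# Fetch the same repo-quality data as the curl example above.
import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/mcbieda/llm-from-scratch")
headers = {}
# headers["X-API-Key"] = "YOUR_KEY"  # hypothetical header for the 1,000/day keyed tier

resp = requests.get(url, headers=headers, timeout=10)  # anonymous tier: 100 requests/day
resp.raise_for_status()
print(resp.json())
```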
Higher-rated alternatives
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
Lightning-AI/litgpt
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mosaicml/llm-foundry
LLM training code for Databricks foundation models
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...