Lightning-AI/litgpt
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
This project helps machine learning engineers and researchers build custom large language models: select from more than 20 pre-built LLMs, train or fine-tune them on your own datasets, and deploy the resulting models for a range of applications. It is designed for users who need fine-grained control and high performance for custom AI language tasks.
13,225 stars. Actively maintained with 11 commits in the last 30 days. Available on PyPI.
Use this if you are a machine learning engineer or researcher needing to pretrain, fine-tune, or deploy large language models with complete control and optimized performance on GPUs or TPUs.
Not ideal if you are looking for a simple, off-the-shelf LLM API without needing to manage the underlying model architecture or training process.
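The pretrain, fine-tune, and deploy workflow described above maps onto LitGPT's CLI subcommands (`download`, `finetune`, `serve`). A minimal sketch, assuming `pip install 'litgpt[all]'`; the model name, dataset path, and output directory are illustrative, and the script skips the real commands when litgpt is not installed:

```shell
# Illustrative LitGPT workflow; commands follow the project's CLI,
# but model, data, and checkpoint paths here are placeholders.
MODEL="microsoft/phi-2"

if command -v litgpt >/dev/null 2>&1; then
    # Download pretrained weights for one of the 20+ supported models.
    litgpt download "$MODEL"
    # Fine-tune on a custom JSON dataset (LoRA is one of several recipes).
    litgpt finetune "$MODEL" --data JSON --data.json_path my_data.json
    # Serve the resulting checkpoint behind a local inference endpoint.
    litgpt serve out/finetune/lora/final
else
    echo "litgpt not installed; try: pip install 'litgpt[all]'"
fi
```

Each step is a separate command, so the same checkpoint directory can be reused across fine-tuning runs and serving without extra glue code.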
Stars: 13,225
Forks: 1,409
Language: Python
License: Apache-2.0
Category:
Last pushed: Mar 06, 2026
Commits (30d): 11
Dependencies: 8
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Lightning-AI/litgpt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
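The same data can be fetched from Python instead of curl. A minimal sketch using only the standard library; the endpoint is the one shown above, and since the response schema is not documented here, the body is decoded as generic JSON rather than assuming field names:

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (makes a network call)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("Lightning-AI", "litgpt"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/Lightning-AI/litgpt
```

Within the free tier (100 requests/day without a key), no authentication header is needed for `fetch_quality`.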
Related models
AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mosaicml/llm-foundry
LLM training code for Databricks foundation models
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
rickiepark/llm-from-scratch
Code repository for the Korean-language book *Build an LLM from Scratch* (Gilbut, 2025)