Lightning-AI/litgpt

20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.

Quality score: 78 / 100 (Verified)

Implements models from scratch, without abstraction layers, and combines Flash Attention with FSDP for distributed training across 1 to 1,000+ GPUs/TPUs. Supports parameter-efficient finetuning via LoRA/QLoRA with mixed-precision quantization (fp4/8/16/32) to reduce GPU memory requirements. Integrates with PyTorch Lightning and Lightning Cloud infrastructure for end-to-end pretraining, finetuning, and deployment workflows driven by declarative YAML recipes.
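The YAML recipes mentioned above look roughly like the sketch below. The field names are assumptions modeled on the style of litgpt's published config examples and may differ between versions, so treat this as illustrative rather than a working recipe:

```yaml
# Illustrative LoRA finetuning recipe; field names are assumptions
# modeled on litgpt's config_hub examples and may vary by version.
checkpoint_dir: checkpoints/meta-llama/Llama-2-7b-hf
out_dir: out/finetune/lora
precision: bf16-true      # mixed precision to cut GPU memory use

# LoRA hyperparameters (parameter-efficient finetuning)
lora_r: 8
lora_alpha: 16
lora_dropout: 0.05

train:
  epochs: 1
  micro_batch_size: 4
```

A recipe like this would typically be passed to the CLI with a `--config` flag; consult the project's own config_hub for the exact schema of your installed version.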

13,225 stars and 15,196 monthly downloads. Actively maintained with 5 commits in the last 30 days. Available on PyPI.

Maintenance 13 / 25
Adoption 20 / 25
Maturity 25 / 25
Community 20 / 25


Stars: 13,225
Forks: 1,409
Language: Python
License: Apache-2.0
Last pushed: Mar 06, 2026
Monthly downloads: 15,196
Commits (30d): 5
Dependencies: 8

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Lightning-AI/litgpt"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
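The same lookup can be scripted from Python using only the standard library. The endpoint below is taken from the curl command above; the response schema is not documented on this page, so the result is treated as an opaque dict:

```python
# Minimal sketch: query the quality API shown above for any GitHub repo.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a given GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record as JSON (schema undocumented here)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# Example (network required, counts against the daily quota):
# data = fetch_quality("Lightning-AI", "litgpt")
```

This stays within the keyless 100 requests/day tier; how an API key is attached (header vs. query parameter) is not stated on this page, so it is omitted here.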