Lightning-AI/pytorch-lightning

Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.

Score: 94 / 100 (Verified)

Abstracts away distributed training infrastructure (mixed precision, gradient accumulation, multi-GPU synchronization) through a `LightningModule` API while preserving direct access to PyTorch code. For experts who want less abstraction, Lightning Fabric sits lower on the control spectrum: you keep your own training loop while Fabric handles device placement, precision, and distributed setup.

30,923 stars and 16,568,627 monthly downloads. Used by 83 other packages. Actively maintained with 43 commits in the last 30 days. Available on PyPI.

Maintenance: 23 / 25
Adoption: 25 / 25
Maturity: 25 / 25
Community: 21 / 25


Stars: 30,923
Forks: 3,683
Language: Python
License: Apache-2.0
Last pushed: Mar 10, 2026
Monthly downloads: 16,568,627
Commits (30d): 43
Dependencies: 8
Reverse dependents: 83

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Lightning-AI/pytorch-lightning"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
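The same request can be made from Python's standard library. The endpoint URL is taken verbatim from the curl command above; the response is assumed to be JSON (its exact schema is not documented here), and how an API key would be passed is not shown, so this sketch covers only the keyless free tier.

```python
# Sketch of the API call using only the Python standard library.
import json
import urllib.request

# Endpoint copied verbatim from the curl example above.
API_URL = (
    "https://pt-edge.onrender.com/api/v1/quality/"
    "ml-frameworks/Lightning-AI/pytorch-lightning"
)


def fetch_quality(url: str = API_URL) -> dict:
    """Fetch the quality report and parse it as JSON (assumed format)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)
```

Within the free tier no authentication is needed, so `fetch_quality()` is a drop-in equivalent of the curl command.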