Lightning-AI/pytorch-lightning
Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.
Abstracts away distributed training infrastructure (mixed precision, gradient accumulation, multi-GPU synchronization) behind a `LightningModule` API while preserving direct access to plain PyTorch code. For experts who want lower-level control, Lightning Fabric exposes the raw training loop while still handling device placement and distributed setup.
30,923 stars and 16,568,627 monthly downloads. Used by 83 other packages. Actively maintained with 43 commits in the last 30 days. Available on PyPI.
Stars: 30,923
Forks: 3,683
Language: Python
License: Apache-2.0
Category:
Last pushed: Mar 10, 2026
Monthly downloads: 16,568,627
Commits (30d): 43
Dependencies: 8
Reverse dependents: 83
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Lightning-AI/pytorch-lightning"
Open to everyone: 100 requests/day without a key; a free key raises the limit to 1,000/day.
Related frameworks
pytorch/pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
keras-team/keras
Deep Learning for humans
Lightning-AI/torchmetrics
Machine learning metrics for distributed, scalable PyTorch applications.
lanpa/tensorboardX
tensorboard for pytorch (and chainer, mxnet, numpy, ...)
rwth-i6/returnn
The RWTH extensible training framework for universal recurrent neural networks