pytorch_warmup and pytorch-gradual-warmup-lr

These are competing libraries offering learning rate warmup scheduling for PyTorch: pytorch_warmup provides a simpler warmup implementation, while pytorch-gradual-warmup-lr offers a more feature-rich gradual warmup approach that composes with other schedulers.

pytorch_warmup
Overall score: 50 (Established)
Maintenance 2/25 · Adoption 10/25 · Maturity 25/25 · Community 13/25
Stars: 415 · Forks: 23 · Commits (30d): 0 · Language: Python · License: MIT
Stale 6m

pytorch-gradual-warmup-lr
Maintenance 0/25 · Adoption 10/25 · Maturity 16/25 · Community 21/25
Stars: 991 · Forks: 126 · Commits (30d): 0 · Language: Python · License: MIT
Stale 6m · No Package · No Dependents

About pytorch_warmup

Tony-Y/pytorch_warmup

Learning Rate Warmup in PyTorch

About pytorch-gradual-warmup-lr

ildoonet/pytorch-gradual-warmup-lr

Gradually-Warmup Learning Rate Scheduler for PyTorch

Implements a composable learning rate scheduler that linearly increases the learning rate over a warmup phase before delegating to a secondary scheduler (e.g., StepLR, ExponentialLR, CosineAnnealingLR) for the main training phase. Chains with PyTorch's native `torch.optim.lr_scheduler` interface through an `after_scheduler` parameter, enabling flexible scheduler composition without modifying optimizer code. Based on the large-batch SGD warmup strategy from Goyal et al., it prevents training instability during initial iterations when using aggressive learning rates or large batch sizes.
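The composition pattern described above can be sketched in plain Python. This is a minimal illustration of the idea (linear ramp during warmup, then hand-off to a secondary schedule), not the library's actual API; the function names `warmup_lr` and `step_decay` and the specific parameter values are assumptions for the sketch.

```python
import math

def warmup_lr(base_lr, epoch, total_epoch, multiplier, after_scheduler):
    """Return the LR at `epoch`: linearly ramp base_lr up to
    base_lr * multiplier over `total_epoch` warmup epochs, then
    delegate to `after_scheduler` with epochs re-based to zero.
    (Illustrative sketch, not the library's real interface.)"""
    if epoch < total_epoch:
        # linear ramp: base_lr -> base_lr * multiplier
        return base_lr * (1.0 + (multiplier - 1.0) * epoch / total_epoch)
    # after warmup, the secondary scheduler takes over
    return after_scheduler(base_lr * multiplier, epoch - total_epoch)

def step_decay(lr, epoch, step_size=30, gamma=0.1):
    """A StepLR-style secondary schedule: multiply by gamma every step_size epochs."""
    return lr * (gamma ** (epoch // step_size))

# Example: warm up from 0.01 toward 0.1 over 5 epochs, then StepLR decay.
lrs = [warmup_lr(0.01, e, total_epoch=5, multiplier=10,
                 after_scheduler=step_decay) for e in range(8)]
```

In the real library the same composition is expressed by wrapping a `torch.optim.lr_scheduler` instance (e.g. `StepLR`) via the `after_scheduler` parameter, so the optimizer code never needs to know a warmup phase exists.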

Scores updated daily from GitHub, PyPI, and npm data.