pytorch_warmup and pytorch-gradual-warmup-lr
These are competing libraries that provide learning-rate warmup scheduling for PyTorch. The second offers a more feature-rich, composable gradual-warmup approach, while the first provides a simpler warmup implementation.
About pytorch_warmup
Tony-Y/pytorch_warmup
Learning Rate Warmup in PyTorch
About pytorch-gradual-warmup-lr
ildoonet/pytorch-gradual-warmup-lr
Gradually-Warmup Learning Rate Scheduler for PyTorch
Implements a composable learning rate scheduler that linearly increases the learning rate over a warmup phase before delegating to a secondary scheduler (e.g., StepLR, ExponentialLR, CosineAnnealingLR) for the main training phase. It chains with PyTorch's native `torch.optim.lr_scheduler` interface through an `after_scheduler` parameter, enabling flexible scheduler composition without modifying optimizer code. Based on the large-batch SGD warmup strategy from Goyal et al. ("Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour", 2017), it prevents training instability during initial iterations when using aggressive learning rates or large batch sizes.
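The warmup-then-delegate logic described above can be sketched without any PyTorch dependency. The function below is an illustrative stand-in, not the library's code: `multiplier`, `total_epoch`, and the `after_fn` delegate mirror the roles of the library's `multiplier`, `total_epoch`, and `after_scheduler` parameters (names assumed from the project README), and the delegate receives epochs counted from the end of warmup.

```python
def warmup_lr(epoch, base_lr, multiplier, total_epoch, after_fn):
    """Linear warmup from base_lr to base_lr * multiplier over
    total_epoch epochs, then hand off to after_fn (a stand-in for an
    after_scheduler such as StepLR or CosineAnnealingLR)."""
    if epoch <= total_epoch:
        # Interpolate linearly: epoch 0 -> base_lr,
        # epoch total_epoch -> base_lr * multiplier.
        return base_lr * ((multiplier - 1.0) * epoch / total_epoch + 1.0)
    # After warmup, the delegate sees epochs re-based to zero.
    return after_fn(epoch - total_epoch)

# Example delegate: step decay that halves the post-warmup LR every 10 epochs.
step_decay = lambda e: 0.8 * (0.5 ** (e // 10))

warmup_lr(0, 0.1, 8, 5, step_decay)   # start of warmup: 0.1
warmup_lr(5, 0.1, 8, 5, step_decay)   # end of warmup: 0.1 * 8 = 0.8
warmup_lr(15, 0.1, 8, 5, step_decay)  # delegated: 0.8 * 0.5 = 0.4
```

With the library itself, the same schedule would look roughly like `GradualWarmupScheduler(optimizer, multiplier=8, total_epoch=5, after_scheduler=...)` stepped once per epoch; treat that call shape as an assumption to verify against the repository.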