ildoonet/pytorch-gradual-warmup-lr

Gradually-Warmup Learning Rate Scheduler for PyTorch

Score: 47 / 100 (Emerging)

Implements a composable learning rate scheduler that linearly increases the learning rate over a warmup phase before delegating to a secondary scheduler (e.g., StepLR, ExponentialLR, CosineAnnealingLR) for the main training phase. It chains with PyTorch's native `torch.optim.lr_scheduler` interface through an `after_scheduler` parameter, enabling flexible scheduler composition without modifying optimizer code. Based on the large-batch SGD warmup strategy of Goyal et al., it prevents training instability during the initial iterations when aggressive learning rates or large batch sizes are used.
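The warmup-then-delegate idea can be sketched without the library itself. Below is a minimal, self-contained illustration of the scheduling math: the learning rate ramps linearly from the base rate to base rate times a multiplier over the warmup epochs, after which a StepLR-style decay (standing in for the `after_scheduler`) takes over. All names and parameter values here are hypothetical and chosen for illustration; this is not the repository's actual implementation.

```python
import math

def warmup_then_step(epoch, base_lr=0.1, multiplier=10.0, total_epoch=5,
                     step_size=10, gamma=0.5):
    """Illustrative warmup-then-delegate schedule (hypothetical parameters).

    Warmup phase (epoch <= total_epoch): linearly scale the learning rate
    from base_lr up to base_lr * multiplier.
    Delegation phase: a StepLR-style decay applied to the warmed-up rate,
    standing in for an arbitrary after_scheduler.
    """
    if epoch <= total_epoch:
        return base_lr * ((multiplier - 1.0) * epoch / total_epoch + 1.0)
    decay_steps = (epoch - total_epoch) // step_size
    return base_lr * multiplier * (gamma ** decay_steps)

# Sanity check: lr starts at base_lr, reaches base_lr * multiplier at the
# end of warmup, then decays by gamma every step_size epochs.
print([round(warmup_then_step(e), 3) for e in (0, 5, 15)])
```

In the real library the same delegation is expressed by passing any `torch.optim.lr_scheduler` instance as `after_scheduler`, so the main-phase decay policy is fully pluggable.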

991 stars. No commits in the last 6 months.

Badges: Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 21 / 25


Stars: 991
Forks: 126
Language: Python
License: MIT
Last pushed: Oct 10, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/ildoonet/pytorch-gradual-warmup-lr"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.