aytugyuruk/optimizer-comparisions-training-with-limited-epochs
Optimizer Comparison Study - Empirical analysis of SGD vs Adam performance on MNIST with various initialization and scheduler configurations
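As a rough illustration of what such a study involves, here is a minimal PyTorch sketch of an SGD vs. Adam comparison on MNIST under a small epoch budget. The model, hyperparameters, and StepLR scheduler are illustrative assumptions, not the repository's actual configuration.

# Minimal sketch of an SGD-vs-Adam comparison on MNIST with a limited
# epoch budget. Model, hyperparameters, and scheduler are assumptions
# for illustration, not the repository's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

def make_model() -> nn.Module:
    # Small MLP classifier; any architecture would do for the comparison.
    return nn.Sequential(nn.Flatten(), nn.Linear(784, 128),
                         nn.ReLU(), nn.Linear(128, 10))

def train(optimizer_name: str, epochs: int = 3) -> float:
    torch.manual_seed(0)  # identical initialization so only the optimizer differs
    model = make_model()
    data = datasets.MNIST("data", train=True, download=True,
                          transform=transforms.ToTensor())
    loader = DataLoader(data, batch_size=64, shuffle=True)
    if optimizer_name == "sgd":
        opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    else:
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # One scheduler step per epoch; StepLR is an assumed choice.
    sched = torch.optim.lr_scheduler.StepLR(opt, step_size=1, gamma=0.7)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = F.cross_entropy(model(x), y)
            loss.backward()
            opt.step()
        sched.step()
    return loss.item()  # final-batch loss, enough for a rough comparison

if __name__ == "__main__":
    for name in ("sgd", "adam"):
        print(name, train(name))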
Stars: —
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Feb 08, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aytugyuruk/optimizer-comparisions-training-with-limited-epochs"
Open to everyone: 100 requests/day with no API key required. A free key raises the limit to 1,000 requests/day.
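For scripted access, a minimal Python sketch using the requests library is shown below; it assumes the endpoint returns JSON, whose exact schema is not documented here.

# Minimal sketch of calling the endpoint from Python. The response is
# assumed to be JSON; its schema is not documented on this page.
import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "aytugyuruk/optimizer-comparisions-training-with-limited-epochs")
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # fail loudly on HTTP errors or rate limiting
print(resp.json())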
Higher-rated alternatives
metaopt/torchopt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
SimplexLab/TorchJD
Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with...
clovaai/AdamP
AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights (ICLR 2021)
nschaetti/EchoTorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on...
gpauloski/kfac-pytorch
Distributed K-FAC preconditioner for PyTorch