SENATOROVAI/gradient-descent-sgd-solver-course

Stochastic Gradient Descent (SGD) is an optimization algorithm that updates model parameters iteratively using small, random subsets (batches) of data rather than the entire dataset. It significantly speeds up training on large datasets, though the noise it introduces can, in some cases, cause heavy fluctuations in the loss. Topics: deep learning, neural networks, solver.
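The mini-batch update described above can be sketched as follows. This is an illustrative implementation on least-squares linear regression, not code from the repository; the function name, learning rate, and batch size are all assumptions.

```python
import numpy as np

def sgd(X, y, lr=0.01, batch_size=32, epochs=100, seed=0):
    """Mini-batch SGD for least-squares linear regression (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Shuffle once per epoch, then walk through the data in small batches
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of the mean squared error on this mini-batch only
            grad = 2.0 / len(batch) * Xb.T @ (Xb @ w - yb)
            w -= lr * grad
    return w
```

Because each step uses only a batch-sized estimate of the full gradient, individual updates are noisy, but the cost per step is independent of the dataset size.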

Score: 43 / 100 (Emerging)
No Package · No Dependents
Maintenance: 10 / 25
Adoption: 6 / 25
Maturity: 9 / 25
Community: 18 / 25


Stars: 17
Forks: 14
Language: Jupyter Notebook
License: MIT
Last pushed: Mar 05, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SENATOROVAI/gradient-descent-sgd-solver-course"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.