mytechnotalent/RENN
Inspired by Andrej Karpathy's micrograd, this lecture builds a neural network from scratch: manually deriving gradients, automating backpropagation, and using the tanh activation for nonlinearity. It then bridges to PyTorch, demonstrating how gradient descent minimizes loss and revealing neural network fundamentals.
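To make the idea concrete, here is a minimal sketch of the kind of scalar autograd engine the lecture describes, modeled on micrograd. The Value class, its method names, and the example neuron are illustrative assumptions, not code taken from RENN.

import math

class Value:
    """Scalar that records the ops applied to it so backprop can replay them."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1.0 - t * t) * out.grad   # d tanh(x)/dx = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# One tanh neuron: out = tanh(w1*x1 + w2*x2 + b).
x1, x2 = Value(2.0), Value(0.0)
w1, w2 = Value(-3.0), Value(1.0)
b = Value(6.8813735870195432)
out = (x1 * w1 + x2 * w2 + b).tanh()
out.backward()
print(out.data, x1.grad, w1.grad)

The PyTorch bridge amounts to checking the hand-derived gradients against autograd; a sketch under the same assumptions:

import torch

x1 = torch.tensor(2.0, requires_grad=True)
x2 = torch.tensor(0.0, requires_grad=True)
w1 = torch.tensor(-3.0, requires_grad=True)
w2 = torch.tensor(1.0, requires_grad=True)
b = torch.tensor(6.8813735870195432, requires_grad=True)
out = torch.tanh(x1 * w1 + x2 * w2 + b)
out.backward()
print(out.item(), x1.grad.item(), w1.grad.item())  # should match the values printed above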
No commits in the last 6 months.
Stars: 2
Forks: —
Language: Jupyter Notebook
License: Apache-2.0
Category: —
Last pushed: Nov 22, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/mytechnotalent/RENN"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
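The same request from Python, as a minimal sketch using the requests library; only the URL above is documented, and attaching a key for the 1,000/day tier would follow whatever header or parameter the key signup specifies:

import requests

URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/mytechnotalent/RENN"

resp = requests.get(URL, timeout=10)  # keyless access: up to 100 requests/day
resp.raise_for_status()
print(resp.json())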
Higher-rated alternatives
digantamisra98/Mish
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
itdxer/neupy
NeuPy is a TensorFlow-based Python library for prototyping and building neural networks
Sentdex/nnfs_book
Sample code from the Neural Networks from Scratch book.
vzhou842/cnn-from-scratch
A Convolutional Neural Network implemented from scratch (using only numpy) in Python.
nicklashansen/rnn_lstm_from_scratch
How to build RNNs and LSTMs from scratch with NumPy.