JonathanRaiman/theano_lstm
:microscope: Nano size Theano LSTM module
Provides modular layer types (LSTM, RNN, Embedding, GatedInput) built on Theano's symbolic computation graph, with support for stacked architectures and backpropagation through time via `theano.scan`. Stabilizes training with element-wise gradient clipping and Adadelta optimization to mitigate vanishing and exploding gradients in sequence modeling. Supports variable-length sequences through masked loss computation, enabling efficient minibatch training without requiring every sequence in a batch to have the same length.
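The masked-loss and clipping ideas above can be sketched in plain NumPy. This is a conceptual illustration only, not theano_lstm's actual API; the function names `masked_nll` and `clip_elementwise` are hypothetical:

```python
import numpy as np

def masked_nll(log_probs, targets, lengths):
    """Mean negative log-likelihood over only the valid (unpadded)
    timesteps of each sequence in a padded minibatch.

    log_probs: (batch, time, vocab) log-probabilities
    targets:   (batch, time) integer target indices (padding arbitrary)
    lengths:   (batch,) true sequence lengths
    """
    batch, time, _ = log_probs.shape
    # mask[i, t] = 1 while t < lengths[i], else 0 — padded steps contribute nothing
    mask = (np.arange(time)[None, :] < lengths[:, None]).astype(log_probs.dtype)
    # pick the log-probability assigned to each target symbol
    picked = np.take_along_axis(log_probs, targets[..., None], axis=2)[..., 0]
    return -(picked * mask).sum() / mask.sum()

def clip_elementwise(grad, bound=1.0):
    """Element-wise gradient clipping to [-bound, bound], the simple
    stabilization scheme described above."""
    return np.clip(grad, -bound, bound)
```

Because the mask zeroes out padded timesteps before averaging, sequences of different lengths can share one minibatch without the padding biasing the loss.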
303 stars and 27 monthly downloads. No commits in the last 6 months. Available on PyPI.
Stars: 303
Forks: 111
Language: Python
License: —
Category: —
Last pushed: Nov 16, 2016
Monthly downloads: 27
Commits (30d): 0
Get this data via API:

```shell
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/JonathanRaiman/theano_lstm"
```
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
- google/tangent: Source-to-Source Debuggable Derivatives in Pure Python
- ahrefs/ocannl: OCANNL: OCaml Compiles Algorithms for Neural Networks Learning
- pranftw/neograd: A deep learning framework created from scratch with Python and NumPy
- statusfailed/catgrad: a categorical deep learning compiler
- mstksg/backprop: Heterogeneous automatic differentiation ("backpropagation") in Haskell