JackSteve-code/RNNS-to-transformers
A comprehensive technical survey and implementation guide of sequence modeling evolution. Features mathematical foundations and PyTorch implementations of RNNs, LSTMs, GRUs, and Transformers, exploring the transition from sequential recurrence to parallelized self-attention and modern LLM scaling laws.
Stars: —
Forks: —
Language: HTML
License: —
Category: —
Last pushed: Mar 02, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/JackSteve-code/RNNS-to-transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
xv44586/toolkit4nlp
Transformer implementations (architectures, task examples, serving, and more)
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper