JackSteve-code/RNNS-to-transformers
A comprehensive technical survey and implementation guide to the evolution of sequence modeling. Features mathematical foundations and PyTorch implementations of RNNs, LSTMs, GRUs, and Transformers, tracing the transition from sequential recurrence to parallelized self-attention and modern LLM scaling laws.
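To illustrate the topic the survey covers (this is a generic, dependency-free sketch, not code taken from the repository): scaled dot-product attention computes every position's output in one pass over the sequence, in contrast to a recurrence that must consume tokens one at a time.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention over a whole sequence at once.

    queries, keys, values: lists of equal-length vectors (lists of floats).
    Returns one output vector per query: a softmax-weighted mix of values.
    """
    d = len(keys[0])  # key dimension, used for the 1/sqrt(d) scaling
    outputs = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs
```

Because each query attends to all keys independently, the loop over queries parallelizes trivially, which is the property the repository's description contrasts with sequential recurrence.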
Stars: —
Forks: —
Language: HTML
License: —
Category:
Last pushed: Mar 02, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/JackSteve-code/RNNS-to-transformers"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
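The curl command above can be reproduced in Python. This sketch only builds the endpoint URL from an owner/repo pair and fetches it with the standard library; the response schema and the header used to pass an API key are not documented here, so both are left out.

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub owner/repo."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (schema not documented here)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `quality_url("JackSteve-code", "RNNS-to-transformers")` reproduces the URL from the curl command above.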
Higher-rated alternatives
huggingface/transformers-bloom-inference
Fast Inference Solutions for BLOOM
Tencent/TurboTransformers
a fast and user-friendly runtime for transformer inference (Bert, Albert, GPT2, Decoders, etc)...
mit-han-lab/lite-transformer
[ICLR 2020] Lite Transformer with Long-Short Range Attention
mit-han-lab/hardware-aware-transformers
[ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
LibreTranslate/Locomotive
Toolkit for training/converting LibreTranslate compatible language models 🚂