avijit-jana/SeqFlipAttention
SeqFlipAttention is a PyTorch demonstration of sequence-to-sequence learning with attention, trained on a synthetic sequence-reversal task. It includes training scripts, loss and accuracy visualizations, and a quantitative comparison of model performance with and without attention.
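The repository's own code is not reproduced here, but the technique the description names can be sketched compactly. The following is a minimal, illustrative implementation, not the project's actual model: it assumes a small digit vocabulary, a single-layer GRU encoder and decoder, and dot-product (Luong-style) attention, and trains on randomly generated sequences whose target is simply the input reversed. All sizes and names are assumptions for the sketch.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, HID = 12, 64  # 10 digits + SOS token (id 10); sizes are illustrative

class Seq2SeqAttn(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, HID)
        self.enc = nn.GRU(HID, HID, batch_first=True)
        self.dec = nn.GRU(HID, HID, batch_first=True)
        self.out = nn.Linear(2 * HID, VOCAB)

    def forward(self, src, tgt_in):
        enc_out, h = self.enc(self.emb(src))        # (B, S, H), (1, B, H)
        dec_out, _ = self.dec(self.emb(tgt_in), h)  # (B, T, H)
        # Dot-product attention: each decoder step scores all encoder states,
        # then mixes them into a context vector.
        scores = dec_out @ enc_out.transpose(1, 2)  # (B, T, S)
        ctx = scores.softmax(-1) @ enc_out          # (B, T, H)
        return self.out(torch.cat([dec_out, ctx], -1))  # (B, T, VOCAB)

def batch(n=64, length=8):
    src = torch.randint(0, 10, (n, length))
    tgt = src.flip(1)  # target is the input sequence reversed
    sos = torch.full((n, 1), 10, dtype=torch.long)
    return src, torch.cat([sos, tgt[:, :-1]], 1), tgt  # teacher forcing

model = Seq2SeqAttn()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(300):
    src, tgt_in, tgt = batch()
    loss = nn.functional.cross_entropy(
        model(src, tgt_in).flatten(0, 1), tgt.flatten())
    opt.zero_grad(); loss.backward(); opt.step()
```

The attention weights (`scores.softmax(-1)`) can be inspected after training; on a reversal task they typically concentrate along the anti-diagonal, which is the kind of qualitative evidence the repo's analysis of attention's impact would rely on.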
Stars
2
Forks
—
Language
Jupyter Notebook
License
AGPL-3.0
Category
Last pushed
Jan 07, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/avijit-jana/SeqFlipAttention"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
Higher-rated alternatives
facebookresearch/fairseq2
FAIR Sequence Modeling Toolkit 2
OpenNMT/OpenNMT-tf
Neural machine translation and sequence learning using TensorFlow
lhotse-speech/lhotse
Tools for handling multimodal data in machine learning projects.
awslabs/sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
google/sequence-layers
A neural network layer API and library for sequence modeling, designed for easy creation of...