seq2seq and Seq2Seq-PyTorch

These projects are competitors: both are standalone, educational implementations of sequence-to-sequence models with attention for neural machine translation (NMT), designed to teach or prototype the same architecture rather than to work together or fill different roles in a shared ecosystem.

| Metric | seq2seq | Seq2Seq-PyTorch |
|---|---|---|
| Score | 51 (Established) | 51 (Established) |
| Maintenance | 0/25 | 0/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 16/25 | 16/25 |
| Community | 25/25 | 25/25 |
| Stars | 703 | 742 |
| Forks | 168 | 161 |
| Downloads | | |
| Commits (30d) | 0 | 0 |
| Language | Python | Python |
| License | MIT | WTFPL |
| Status | Archived, Stale 6m, No Package, No Dependents | Stale 6m, No Package, No Dependents |

About seq2seq

keon/seq2seq

Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch

About Seq2Seq-PyTorch

MaximumEntropy/Seq2Seq-PyTorch

Sequence to Sequence Models with PyTorch

This project helps machine learning engineers and researchers build and experiment with sequence-to-sequence models for tasks like machine translation. It takes sequences of words or characters in one language as input and produces translated sequences in another. The implementations cover standard and attention-based models, providing a foundation for natural language processing applications.
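To make the architecture concrete, here is a minimal sketch of the encoder-decoder-with-attention pattern both repositories implement: a GRU encoder reads the source sequence, and a GRU decoder attends over the encoder states (dot-product attention) at each step. All names, layer sizes, and hyperparameters below are illustrative assumptions, not code from either project.

```python
# Minimal encoder-decoder with dot-product attention (illustrative sketch;
# not taken from keon/seq2seq or MaximumEntropy/Seq2Seq-PyTorch).
import torch
import torch.nn as nn


class Encoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) -> outputs: (batch, src_len, hidden), hidden: (1, batch, hidden)
        return self.rnn(self.embed(src))


class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, tgt, hidden, enc_outputs):
        # tgt: (batch, 1) -- decode one target token at a time
        dec_out, hidden = self.rnn(self.embed(tgt), hidden)
        # Dot-product attention: score each encoder state against the decoder state
        scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))   # (batch, 1, src_len)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)                  # (batch, 1, hidden)
        # Predict the next token from the decoder state and attention context
        logits = self.out(torch.cat([dec_out, context], dim=-1))   # (batch, 1, vocab)
        return logits, hidden


# One decoding step over a toy batch
enc = Encoder(vocab_size=100, embed_dim=32, hidden_dim=64)
dec = AttnDecoder(vocab_size=120, embed_dim=32, hidden_dim=64)
src = torch.randint(0, 100, (4, 7))        # 4 source sentences of length 7
enc_outputs, hidden = enc(src)
tgt_tok = torch.randint(0, 120, (4, 1))    # previous target token per sentence
logits, hidden = dec(tgt_tok, hidden, enc_outputs)
print(logits.shape)
```

In training, the decoder step runs in a loop over the target sequence (typically with teacher forcing); at inference, the argmax or a beam search feeds each predicted token back in as the next input.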

Machine Translation · Natural Language Processing · Deep Learning · Research · AI Model Development

Scores updated daily from GitHub, PyPI, and npm data.