fvalerii/nmt-seq2seq-translation
🚀 High-performance NMT study scaling Seq2Seq LSTMs to 200k+ sentence pairs. Features a streaming tf.data pipeline, transfer learning with pretrained NNLM embeddings, and a masked loss. Reaches 17.32 BLEU on English-to-German translation. Developed for Imperial College London's TensorFlow Certification.
Stars
—
Forks
—
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Jan 30, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/fvalerii/nmt-seq2seq-translation"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
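The "masked loss" named in the description is a standard trick for padded sequence batches: cross-entropy is averaged only over real tokens, so padding positions contribute nothing to the gradient. A minimal NumPy sketch of the idea (the function name, `pad_id` convention, and shapes are illustrative assumptions, not code from this repository):

```python
import numpy as np

def masked_sparse_ce(y_true, logits, pad_id=0):
    """Cross-entropy averaged over non-padding tokens only.

    y_true: int array of token ids, shape (batch, seq_len)
    logits: float array, shape (batch, seq_len, vocab_size)
    """
    # Numerically stable softmax over the vocabulary axis
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    # Per-token negative log-likelihood of the target id
    nll = -np.log(np.take_along_axis(probs, y_true[..., None], axis=-1)[..., 0])
    # Zero out padding positions, then average over real tokens
    mask = (y_true != pad_id).astype(nll.dtype)
    return (nll * mask).sum() / mask.sum()
```

In a TensorFlow implementation the same effect is usually achieved by multiplying `SparseCategoricalCrossentropy(reduction="none")` output by the mask before reducing; the NumPy version above just makes the arithmetic explicit.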
Higher-rated alternatives
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
xv44586/toolkit4nlp
Transformer implementations (architectures, task examples, serving, and more)
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper