lvapeab/nmt-keras
Neural Machine Translation with Keras
Implements both attention-based RNN and Transformer architectures with multi-GPU training, beam search decoding, and ensemble inference with length/coverage normalization. Supports multiple attention mechanisms (Bahdanau, Luong, double stochastic), conditional GRU/LSTM units, interactive translation protocols, and integration with pretrained embeddings (GloVe, Word2Vec). Built on Keras/TensorFlow with client-server architecture for web deployment and Tensorboard monitoring.
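As a rough illustration of the Bahdanau (additive) attention variant named above, here is a minimal NumPy sketch. All names, shapes, and weight matrices are illustrative only, not the repo's actual API: the score is v · tanh(W_q q + W_k k), softmaxed over source positions to weight the encoder states.

```python
import numpy as np

def bahdanau_attention(query, keys, W_q, W_k, v):
    """Additive (Bahdanau) attention: score(q, k) = v . tanh(W_q q + W_k k).

    query: decoder state, shape (d,); keys: encoder states, shape (src_len, d).
    Returns the context vector and the attention weights.
    """
    scores = np.tanh(query @ W_q + keys @ W_k) @ v  # one score per source position
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax over source positions
    context = weights @ keys                        # weighted sum of encoder states
    return context, weights

# Toy example with random parameters (dimensions are arbitrary).
rng = np.random.default_rng(0)
d = 4
query = rng.normal(size=d)
keys = rng.normal(size=(5, d))
W_q, W_k, v = rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=d)
context, weights = bahdanau_attention(query, keys, W_q, W_k, v)
```

The Luong (multiplicative) variant the repo also supports replaces the additive score with a dot product or bilinear form; the softmax-and-weighted-sum step is the same.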
531 stars. No commits in the last 6 months.
Stars: 531
Forks: 126
Language: Python
License: MIT
Category:
Last pushed: Jul 30, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/lvapeab/nmt-keras"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
dair-ai/Transformers-Recipe
🧠A study guide to learn about Transformers
jaketae/ensemble-transformers
Ensembling Hugging Face transformers made easy
SirawitC/Transformer_from_scratch_pytorch
Build a transformer model from scratch using pytorch to understand its inner workings and gain...
lof310/transformer
PyTorch implementation of the current SOTA Transformer. Configurable, efficient, and...
jiangtaoxie/SoT
SoT: Delving Deeper into Classification Head for Transformer