elbayadm/attn2d

Pervasive Attention: 2D Convolutional Networks for Sequence-to-Sequence Prediction

Score: 46 / 100 (Emerging)

Implements 2D convolutions that jointly encode source and target sequences for neural machine translation, yielding a decoding grid well suited to wait-k simultaneous translation. Extends Fairseq with Transformer-based wait-k models featuring unidirectional encoders and joint multi-path training. Built on PyTorch, with distributed multi-GPU training via NCCL.
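The two ideas in the description can be sketched in a few lines: a joint source-target embedding grid that a 2D CNN would convolve over, and the wait-k read schedule that bounds how much source context is visible at each decoding step. This is a minimal illustration under stated assumptions, not the repository's actual code; all names here are hypothetical.

```python
import numpy as np

def build_grid(src_emb, tgt_emb):
    """Stack source/target embeddings into a (Tt, Ts, 2d) grid.

    Cell (i, j) concatenates target embedding i with source embedding j;
    a 2D CNN over this grid jointly encodes both sequences.
    """
    Tt, d = tgt_emb.shape
    Ts = src_emb.shape[0]
    grid = np.empty((Tt, Ts, 2 * d))
    grid[:, :, :d] = tgt_emb[:, None, :]  # broadcast target along the source axis
    grid[:, :, d:] = src_emb[None, :, :]  # broadcast source along the target axis
    return grid

def wait_k_context(step, k, src_len):
    """Source tokens visible when emitting target token `step` (0-based)
    under a wait-k schedule: read k tokens first, then alternate read/write."""
    return min(k + step, src_len)
```

In the grid formulation, masking the convolution along the source axis to the first `wait_k_context(step, k, src_len)` columns gives the decoding behavior the description calls an "efficient decoding grid" for simultaneous translation.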

498 stars. No commits in the last 6 months.

Flags: Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 20 / 25


Stars: 498
Forks: 71
Language: Python
License: MIT
Last pushed: May 08, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/elbayadm/attn2d"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
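The curl command above can be reproduced in Python with only the standard library. This is a sketch: the endpoint URL is taken verbatim from the page, but how an API key is passed is not documented here, so the Authorization header below is an assumption.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(topic, owner, repo):
    """Build the quality endpoint URL shown in the curl example above."""
    return f"{API_BASE}/{topic}/{owner}/{repo}"

def fetch_quality(topic, owner, repo, api_key=None):
    """Fetch the quality record for a repository.

    The key-passing mechanism is an assumption (a Bearer Authorization
    header); the page only says a free key raises the daily limit.
    """
    req = urllib.request.Request(quality_url(topic, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")  # assumed scheme
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `fetch_quality("nlp", "elbayadm", "attn2d")` requests the same resource as the curl line above.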