attention_is_all_you_need vs. attention-is-all-you-need-paper

These are competitors: both are independent implementations of the Transformer architecture from the seminal 2017 paper "Attention Is All You Need", so users would choose one based on framework preference (Chainer vs., most likely, PyTorch or TensorFlow) rather than use them together.
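The core operation both repositories implement is the paper's scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch (not taken from either repository, just the formula from Vaswani et al. 2017):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # Scale scores by sqrt(d_k) to keep softmax gradients well-behaved
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Example: self-attention over 3 tokens with d_k = 4
x = np.random.default_rng(0).normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
```

Both repositories wrap this operation in multi-head attention and the full encoder–decoder stack; the sketch above shows only the attention kernel itself.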

attention_is_all_you_need
Maintenance 0/25
Adoption 10/25
Maturity 16/25
Community 23/25

attention-is-all-you-need-paper
Maintenance 0/25
Adoption 10/25
Maturity 16/25
Community 22/25
attention_is_all_you_need
Stars: 323
Forks: 66
Downloads: — (no published package)
Commits (30d): 0
Language: Jupyter Notebook
License: BSD-3-Clause
Status: Stale 6m, No Package, No Dependents

attention-is-all-you-need-paper
Stars: 243
Forks: 54
Downloads: — (no published package)
Commits (30d): 0
Language: Jupyter Notebook
License: MIT
Status: Stale 6m, No Package, No Dependents

About attention_is_all_you_need

soskek/attention_is_all_you_need

Transformer of "Attention Is All You Need" (Vaswani et al. 2017) by Chainer.

About attention-is-all-you-need-paper

brandokoch/attention-is-all-you-need-paper

Original transformer paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems. 2017.

Scores updated daily from GitHub, PyPI, and npm data.