kyegomez/AoA-torch

Implementation of Attention on Attention in Zeta

Quality score: 39 / 100 (Emerging)

This project provides an implementation of the Attention on Attention (AoA) mechanism, a component used in advanced deep learning models. It takes an input sequence of data (like numerical representations of text or images) and processes it to produce an output sequence with enhanced contextual understanding. This is useful for researchers and practitioners building custom neural networks, particularly in areas like computer vision or natural language processing, who need to integrate specific attention mechanisms.
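To make the mechanism concrete, here is a minimal NumPy sketch of the AoA computation as described in the original "Attention on Attention" paper (Huang et al., 2019): a standard scaled dot-product attention result is concatenated with the query, then passed through an "information" linear map and a sigmoid "attention gate" whose elementwise product forms the output. This is an illustrative sketch of the math only; the function name, weight handling, and shapes are assumptions, not this repo's actual API.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_on_attention(Q, K, V, W_i, b_i, W_g, b_g):
    """AoA sketch: gate(Q, V_hat) * info(Q, V_hat).

    Q, K, V: (seq, d) arrays; W_i, W_g: (2d, d); b_i, b_g: (d,).
    All weights are caller-supplied (illustrative, not the repo's API).
    """
    d = Q.shape[-1]
    # standard scaled dot-product attention -> attended values V_hat
    v_hat = softmax(Q @ K.T / np.sqrt(d)) @ V
    # concatenate query with attention result: [Q ; V_hat]
    qv = np.concatenate([Q, v_hat], axis=-1)
    info = qv @ W_i + b_i                       # information vector
    gate = 1.0 / (1.0 + np.exp(-(qv @ W_g + b_g)))  # sigmoid attention gate
    return gate * info                          # elementwise gating
```

The gate lets the model suppress attention results that are irrelevant to the query, which is the paper's motivation for stacking "attention on attention."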

Available on PyPI.

Use this if you are a machine learning researcher or engineer designing and experimenting with custom deep learning architectures that require a specific attention mechanism for improved performance.

Not ideal if you are a casual user looking for an out-of-the-box solution to a specific problem without needing to build or modify neural network components.

deep-learning neural-networks computer-vision natural-language-processing model-architecture
No dependents.

Maintenance: 10 / 25
Adoption: 4 / 25
Maturity: 25 / 25
Community: 0 / 25


Stars: 5
Forks:
Language: Python
License: MIT
Last pushed: Feb 16, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/AoA-torch"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
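For scripted access, the same endpoint shown in the curl command above can be called from Python's standard library. The URL pattern (`/quality/<registry>/<owner>/<repo>`) is inferred from that one example; the response fields are not documented here, so the fetch below is left as a commented-out sketch.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(registry: str, owner: str, repo: str) -> str:
    # Builds the endpoint URL following the pattern from the curl example.
    return f"{BASE}/{registry}/{owner}/{repo}"

url = quality_url("transformers", "kyegomez", "AoA-torch")

# To actually fetch the data (requires network access):
# with urllib.request.urlopen(url) as resp:
#     data = json.loads(resp.read())
```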