Rohan-Thoma/Coding-attention-from-scratch

This repository contains code implementing the attention mechanism from scratch for language translation models. The model translates Italian to English and is built from the ground up, without any pretraining.
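The core of such a model is scaled dot-product attention. Below is a minimal NumPy sketch of that operation for illustration; the function name, shapes, and toy inputs are assumptions, not code taken from the repository itself.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Illustrative sketch, not the repository's actual implementation.
    # Q, K, V: arrays of shape (seq_len, d_k).
    d_k = Q.shape[-1]
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V, weights

# Toy example with random queries, keys, and values.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of `w` sums to 1, so `out` is a convex combination of the value vectors, weighted by query-key similarity.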

Score: 11 / 100 (Experimental)

No commits in the last 6 months.

Stale: 6m · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 2 / 25
Maturity: 9 / 25
Community: 0 / 25

Stars: 2
Forks:
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: May 04, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Rohan-Thoma/Coding-attention-from-scratch"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.