DataWorshipper/Machine_Translation
This repository implements a machine translation system from scratch using the Transformer architecture. Inspired by the paper "Attention Is All You Need", the model learns to translate between languages by capturing complex linguistic patterns with deep learning.
No commits in the last 6 months.
Stars: 1
Forks: —
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Oct 12, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/DataWorshipper/Machine_Translation"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
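As a rough sketch, the same request can be made from Python using only the standard library. The URL pattern comes from the curl example above; the helper names and the assumption that the endpoint returns JSON are illustrative, not part of any documented client:

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def build_url(owner: str, repo: str) -> str:
    """Build the stats endpoint URL for a given owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and parse the stats for a repository (live network call).

    Assumes the endpoint returns a JSON object; the actual response
    schema is not documented here.
    """
    with urllib.request.urlopen(build_url(owner, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    print(build_url("DataWorshipper", "Machine_Translation"))
```

With an API key (for the 1,000/day tier), you would presumably pass it as a header or query parameter; the exact mechanism is not specified on this page.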
Higher-rated alternatives
LowinLi/transformers-stream-generator
This is a text generation method which returns a generator, streaming out each token in...
jaymody/picoGPT
An unnecessarily tiny implementation of GPT-2 in NumPy.
ystemsrx/mini-nanoGPT
One-click training of your own GPT. Training a GPT has never been easier for beginners. ...
Eamon2009/Codeformer-A.I
A character-level GPT transformer built from scratch in PyTorch, trained on Linux kernel C...
kyegomez/AttentionGrid
A network of attention mechanisms at your fingertips. Unleash the potential of attention...