GPT Multilingual Training Transformer Models
Six GPT multilingual training models are tracked. The highest-rated is AliHaiderAhmad001/GPT-from-Scratch-with-Tensorflow, scoring 43/100 with 19 stars.
Get all 6 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=gpt-multilingual-training&limit=20"
```
The API is open to everyone at 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
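The same request can be made from Python with only the standard library. This is a minimal sketch based on the curl example above; the shape of the JSON response is not documented here, so the helper simply returns the parsed payload, and the keyless free tier is assumed.

```python
import json
import urllib.parse
import urllib.request

# Endpoint copied from the curl example above.
BASE_URL = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Build the quality-dataset query URL with URL-encoded parameters."""
    params = urllib.parse.urlencode(
        {"domain": domain, "subcategory": subcategory, "limit": limit}
    )
    return f"{BASE_URL}?{params}"

def fetch_projects(domain: str, subcategory: str, limit: int = 20):
    """Fetch the tracked projects and return the parsed JSON.

    Uses the keyless free tier (100 requests/day); response structure
    is whatever the API returns and is not assumed here.
    """
    with urllib.request.urlopen(build_url(domain, subcategory, limit)) as resp:
        return json.load(resp)

# Example (requires network access):
# projects = fetch_projects("transformers", "gpt-multilingual-training")
```

`build_url` is kept separate from the network call so the query construction can be checked without hitting the rate-limited endpoint.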
| # | Model | Description | Score | Tier |
|---|---|---|---|---|
| 1 | AliHaiderAhmad001/GPT-from-Scratch-with-Tensorflow | Implementation for "Improving Language Understanding by Generative... | 43 | Emerging |
| 2 | HomebrewML/HomebrewNLP-torch | A case study of efficient training of large language models using commodity hardware. | | Emerging |
| 3 | nawnoes/pytorch-gpt-x | An implementation of an autoregressive language model using an improved... | | Emerging |
| 4 | qiqiApink/MotionGPT | The official PyTorch implementation of the paper "MotionGPT: Finetuned LLMs... | | Emerging |
| 5 | akshat0123/GPT-1 | PyTorch implementation of GPT-1 | | Emerging |
| 6 | Shenggan/atp | Adaptive Tensor Parallelism for Foundation Models | | Experimental |