GPT Multilingual Training Transformer Models

There are 6 GPT multilingual training models tracked. The highest-rated is AliHaiderAhmad001/GPT-from-Scratch-with-Tensorflow, scoring 43/100 with 19 stars.

Get all 6 projects as JSON

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=gpt-multilingual-training&limit=20"

The API is open to everyone: 100 requests/day with no key needed, or 1,000 requests/day with a free key.
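The same query can be built programmatically with Python's standard library. This is a minimal sketch: the base URL and query parameters are taken from the curl example above, while the helper name `build_query_url` is an illustrative choice, not part of the API.

```python
from urllib.parse import urlencode

# Base endpoint from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_query_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Assemble the dataset-quality query URL (hypothetical helper name).

    Parameters mirror the query string shown in the curl example.
    """
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    return f"{BASE}?{urlencode(params)}"

url = build_query_url("transformers", "gpt-multilingual-training")
print(url)
```

Pass the resulting URL to any HTTP client (e.g. `urllib.request.urlopen` or `curl`) to retrieve the JSON listing.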

| # | Model | Description | Score | Tier |
|---|-------|-------------|-------|------|
| 1 | AliHaiderAhmad001/GPT-from-Scratch-with-Tensorflow | Implementation for "Improving Language Understanding by Generative... | 43 | Emerging |
| 2 | HomebrewML/HomebrewNLP-torch | A case study of efficient training of large language models using commodity hardware. | 37 | Emerging |
| 3 | nawnoes/pytorch-gpt-x | An implementation of an autoregressive language model using an improved... | 34 | Emerging |
| 4 | qiqiApink/MotionGPT | The official PyTorch implementation of the paper "MotionGPT: Finetuned LLMs... | 31 | Emerging |
| 5 | akshat0123/GPT-1 | PyTorch implementation of GPT-1 | 31 | Emerging |
| 6 | Shenggan/atp | Adaptive Tensor Parallelism for Foundation Models | 14 | Experimental |