# GPT-2 Language Models (NLP Tools)
Implementations, training guides, and fine-tuning tools for GPT-2 and GPT-2-based models across languages. Includes pretrained weights, training pipelines, and tutorials. Does NOT include other transformer architectures (BERT, T5), inference-only wrappers, or non-GPT2 generative models.
We track 43 GPT-2 language-model tools. One scores above 50 (the Established tier). The highest-rated is Morizeyao/GPT2-Chinese at 51/100, with 7,598 stars.
Get all 43 projects as JSON (the example request below returns the first 20):

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=nlp&subcategory=gpt2-language-models&limit=20"
```

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.
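If you want to work with the endpoint programmatically, a minimal Python sketch follows. Note the assumptions: the response schema (a list of objects with `name`, `score`, and `tier` keys) and the helper names `build_url` and `tools_in_tier` are hypothetical illustrations, not documented API behavior; only the base URL and query parameters come from the curl example above.

```python
import json
from urllib.parse import urlencode

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Build the dataset query URL using the parameters shown in the curl example."""
    query = urlencode({"domain": domain, "subcategory": subcategory, "limit": limit})
    return f"{BASE}?{query}"

def tools_in_tier(payload: list, tier: str) -> list:
    """Filter a decoded JSON response down to tool names in a given tier.

    ASSUMPTION: each entry is an object with 'name', 'score', and 'tier' keys;
    the real schema is not documented on this page.
    """
    return [tool["name"] for tool in payload if tool.get("tier") == tier]

# Hypothetical sample payload mirroring two rows of the table below.
sample = json.loads("""[
  {"name": "Morizeyao/GPT2-Chinese", "score": 51, "tier": "Established"},
  {"name": "imcaspar/gpt2-ml", "score": null, "tier": "Emerging"}
]""")

print(build_url("nlp", "gpt2-language-models"))
print(tools_in_tier(sample, "Established"))
```

Fetching the URL (e.g. with `urllib.request.urlopen`) and decoding with `json.load` would slot in between the two helpers; it is omitted here so the sketch runs offline.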
| # | Tool | Description | Score | Tier |
|---|------|-------------|-------|------|
| 1 | Morizeyao/GPT2-Chinese | Chinese version of the GPT-2 training code, using the BERT tokenizer. | 51 | Established |
| 2 | imcaspar/gpt2-ml | GPT-2 for multiple languages, including pretrained models (1.5B-parameter Chinese pretrained model). | | Emerging |
| 3 | graykode/gpt-2-Pytorch | Simple text generator built on an OpenAI GPT-2 PyTorch implementation. | | Emerging |
| 4 | gyunggyung/KoGPT2-FineTuning | 🔥 Korean GPT-2 (KoGPT2) fine-tuning, trained on Korean song-lyric data. 🔥 | | Emerging |
| 5 | liucongg/GPT2-NewsTitle | Chinese news-title generation with GPT-2; a heavily commented Chinese GPT-2 news-headline project. | | Emerging |
| 6 | lipiji/Guyu | Chinese GPT-2: pre-training and fine-tuning framework for text generation. | | Emerging |
| 7 | CyberZHG/keras-gpt-2 | Load GPT-2 checkpoints and generate text. | | Emerging |
| 8 | seanie12/neural-question-generation | PyTorch implementation of paragraph-level neural question generation with... | | Emerging |
| 9 | devjwsong/gpt2-dialogue-generation-pytorch | The PyTorch implementation of fine-tuning GPT-2 (Generative Pre-trained... | | Emerging |
| 10 | LALITCHAROLA/genr-kit | 🚀 Prototype and deploy generative AI applications with ease using Python,... | | Emerging |
| 11 | watermelon-3012/gp_image_to_latex | LaTeX generation for mathematical expressions: an experiment with different models. | | Emerging |
| 12 | gyunggyung/LFM2-KoEn-Tuning | Fine-tuning LFM2-1.2B for Korean-English bidirectional translation.... | | Emerging |
| 13 | vlarine/ruGPT2 | Russian GPT-2 model. | | Emerging |
| 14 | snoop2head/KoGPT-Joong-2 | ✍️ Fine-tuning KoGPT-Trinity to write acrostic poems (a Korean acrostic-poem builder). | | Experimental |
| 15 | ADGEfficiency/creative-writing-with-gpt2 | Fine-tune GPT-2 on your favourite authors. | | Experimental |
| 16 | zhenhao-huang/CPM-1-Finetune-Text-Generation | Fine-tune CPM-1 for text generation. | | Experimental |
| 17 | midway2333/Tower-GPT | A very simple GPT implementation. | | Experimental |
| 18 | datares/ez-gpt | Plug-and-play training of GPT-2-based language models. | | Experimental |
| 19 | quentinlintz/synthetic-data-generator | 🦄 Use GPT to generate and label data. | | Experimental |
| 20 | ttop32/KoGPT2novel | Generate novel text; fine-tuned from the SKT KoGPT2 base v2 model (Korean). | | Experimental |
| 21 | viewsetting/MindSpore-GPT2 | An OpenAI GPT-2 implementation in MindSpore. The most updated version can... | | Experimental |
| 22 | manthan89-py/AI-Blog-Writter | This project generates a blog post using natural language... | | Experimental |
| 23 | imsanjoykb/Text-Generation | The goal of this project is to detect the topic of the text and write a... | | Experimental |
| 24 | WordsonRobert/joke-circuits | An "interpretable" low-res joke generator. | | Experimental |
| 25 | b14ucky/Taco-LLMingway | Custom GPT transformer architecture built from scratch in PyTorch. Trained... | | Experimental |
| 26 | rahul-jha98/JustJoking.ai | Using a transformer to learn a language model and generate short jokes. | | Experimental |
| 27 | Pushkar1853/nanoGPT | A simple repository for training/fine-tuning medium-sized GPTs. | | Experimental |
| 28 | Daddy-Myth/fine-tuning-gpt2-for-latex-generation | Fine-tuning GPT-2 using a specially designed prompt to teach this... | | Experimental |
| 29 | aashrafh/anees-dataset | The dataset used to fine-tune the GPT-2 model used in Anees for the... | | Experimental |
| 30 | hakancangunerli/sun-tzu | GPT-2 fine-tuned on The Art of War by Sun Tzu. | | Experimental |
| 31 | Dai-Wenxun/Pointer-Generator-Networks | PyTorch implementation of "Get To The Point: Summarization with... | | Experimental |
| 32 | Koziev/vector2text | Generate Russian text with a GPT model given a LaBSE text-embedding vector. | | Experimental |
| 33 | NikSchaefer/Quotes-Generation | Generating captivating quotes from a single-word input in PyTorch. | | Experimental |
| 34 | SRM47/gpt2-paraphraser-comparisons | Fine-tune GPT-2 models for paraphrasing and compare their outputs with other... | | Experimental |
| 35 | IndexFziQ/LongLM-Eyas | Implementation of IIE-NLP-Eyas@OutGen: Chinese outline-guided story generation... | | Experimental |
| 36 | crackalamoo/bardgpt | BardGPT is a miniature GPT-style model for generating poetry, coded from... | | Experimental |
| 37 | alinrajpoot/genr-kit | Genr-Kit: the ultimate open-source playground for multi-modal AI. One... | | Experimental |
| 38 | RohitKrish46/huberman-GPT | A GPT-2-style character-level language model emulating Andrew Huberman's... | | Experimental |
| 39 | dheeren-tejani/mini-lm-124m | Experimental GPT-2-scale (~124M-parameter) LLM trained from scratch on Google... | | Experimental |
| 40 | sathishkumar67/GPT2-Pretraining_Finewebedu10B | Pretraining the 124M GPT-2 model. | | Experimental |
| 41 | ianomunga/backTrack | Reverse-engineering STTM for backtracking texts to their source language... | | Experimental |
| 42 | boru-roylu/DialGenModel | The official PyTorch implementation of the models used in the DialGen paper. | | Experimental |
| 43 | sghawana/custom-gpt2-tokenizer | A GPT-2-style tokenizer that can be trained on any .txt data to... | | Experimental |