# T5/mT5 Fine-tuning: Transformer Models
Tools and frameworks for training, fine-tuning, and adapting T5 and mT5 transformer models for specific tasks (paraphrasing, simplification, language identification, domain-specific applications). Does NOT include general question-answering systems, content detection, or applications that use pre-trained models without fine-tuning code.
There are 24 T5/mT5 fine-tuning projects tracked; one scores above 50 (established tier). The highest-rated is Shivanandroy/simpleT5 at 54/100, with 400 stars and 309 monthly downloads.
Get the projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=t5-mt5-fine-tuning&limit=20"
```
Open to everyone: 100 requests/day with no key needed, or 1,000/day with a free key.
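The same endpoint can be queried programmatically. A minimal sketch using only the Python standard library, with the base URL and query parameters taken from the curl example above (the response shape beyond "JSON" is not specified here, so the payload is returned unparsed beyond `json.load`):

```python
import json
import urllib.parse
import urllib.request

# Endpoint from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Compose the dataset URL with URL-encoded query parameters."""
    params = urllib.parse.urlencode(
        {"domain": domain, "subcategory": subcategory, "limit": limit}
    )
    return f"{BASE}?{params}"

def fetch_projects(domain: str = "transformers",
                   subcategory: str = "t5-mt5-fine-tuning",
                   limit: int = 20):
    """GET the endpoint and return the parsed JSON payload."""
    with urllib.request.urlopen(build_url(domain, subcategory, limit)) as resp:
        return json.load(resp)
```

Unauthenticated calls count against the 100 requests/day quota, so cache the response rather than fetching per lookup.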
| # | Model | Description | Tier |
|---|---|---|---|
| 1 | Shivanandroy/simpleT5 | simpleT5 is built on top of PyTorch-lightning⚡️ and Transformers🤗 that lets... | Established |
| 2 | chanind/frame-semantic-transformer | Frame Semantic Parser based on T5 and FrameNet | Emerging |
| 3 | Shivanandroy/KeyPhraseTransformer | KeyPhraseTransformer lets you quickly extract key phrases, topics, themes... | Emerging |
| 4 | conceptofmind/t5-pytorch | Implementation of Exploring the Limits of Transfer Learning with a Unified... | Emerging |
| 5 | osainz59/t5-encoder | A extension of Transformers library to include T5ForSequenceClassification class. | Emerging |
| 6 | 0x7o/text2keywords | Trained T5 and T5-large model for creating keywords from text | Experimental |
| 7 | IliesChibane/Text-Combining | A final year master project that consist of creating a text combining api | Experimental |
| 8 | Qingfeng-233/KeyAtten | KeyAtten: Attention-based Zero-Shot Keyword & Keyphrase Extraction | Experimental |
| 9 | awpggexcutor-beep/T5-Refiner-DomainFocus | 🌟 Enhance T5 model performance with domain-specific word masking for... | Experimental |
| 10 | llap4585/T5-Refiner-DomainFocus | Derived from Medical Literature Development: Injecting domain expertise into... | Experimental |
| 11 | llap4585/T5-Refiner-DomainFocus-TrainOnly | This project provides code for fine-tuning T5/mT5 models on data... | Experimental |
| 12 | leonjovanovic/keywords-extraction | Keyword extraction using Scake, KeyBERT, Fine-tuning Transformer BERT-like... | Experimental |
| 13 | rachel-pai/T5Elasticsearch | Elasticsearch with T5/Bert/Other models provided by huggingface Transfomers. | Experimental |
| 14 | ThaminduR/mt5-simplification | Scripts related to training and predicting Google's mt5 model | Experimental |
| 15 | KrishnanJothi/MT5_Language_identification_NLP | MT5-small is fine-tuned on the downstream task of Natural Language... | Experimental |
| 16 | gabriellst/paraphrase.ia | paraphrase.ia is a Chrome extension that let's you make paraphrases of a... | Experimental |
| 17 | devrahulbanjara/Nepali-Language-Paraphraser-in-Devanagari | A Nepali paraphrasing system using NLP, built with Transformer-based models... | Experimental |
| 18 | vikasnair76/text-simplification-using-transformer-models | A transformer-based NLP project for text simplification using BART, T5, and... | Experimental |
| 19 | dane-meister/TellMeWhy-Context-Injection | Fine-tunes a T5-small model on the TellMeWhy dataset using context injection... | Experimental |
| 20 | chrislemke/deep-martin | Text simplification for a better world: Deep-Martin Transformer 🤗 | Experimental |
| 21 | hermanpetrov/KeyBERT-Estonian-setup | This is setup for Estonian text use of keyword extraction with KeyBERT. The... | Experimental |
| 22 | azizbarank/Czech-T5-Base-Model | This is the t5 base model for the Czech that is based on the smaller version... | Experimental |
| 23 | yjg30737/pyqt-paraphrase-model-using-gui | Using paraphrase(text2text generation) model from huggingface in Python desktop app | Experimental |
| 24 | utkarshranaa/Paraphraser.io | Paraphraser.io is a T5 transformer-based paraphrase generation model... | Experimental |
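The top-ranked entry, Shivanandroy/simpleT5, wraps PyTorch Lightning and Hugging Face Transformers behind a small train/predict interface. A minimal fine-tuning sketch, assuming the `simplet5` package is installed and using a toy pandas DataFrame with the `source_text`/`target_text` columns that simpleT5 expects (not runnable offline: loading `t5-base` downloads the pretrained weights):

```python
import pandas as pd
from simplet5 import SimpleT5

# Toy paraphrase-style dataset; simpleT5 reads "source_text" and "target_text".
train_df = pd.DataFrame({
    "source_text": ["paraphrase: The cat sat on the mat."],
    "target_text": ["A cat was sitting on the mat."],
})

model = SimpleT5()
model.from_pretrained(model_type="t5", model_name="t5-base")  # fetches weights
model.train(
    train_df=train_df,
    eval_df=train_df,           # use a held-out split in practice
    source_max_token_len=128,
    target_max_token_len=64,
    batch_size=8,
    max_epochs=3,
    use_gpu=False,
)
print(model.predict("paraphrase: The cat sat on the mat."))
```

For mT5 variants (e.g. the Nepali paraphrasing or mt5-simplification projects above), the same interface applies with `model_type="mt5"` and a multilingual checkpoint such as `google/mt5-small`.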