aaaastark/Pretrain_Finetune_Transformers_Pytorch
Pre-training and fine-tuning transformer models using PyTorch and the Hugging Face Transformers library. Whether you're pre-training on custom datasets or fine-tuning for specific classification tasks, these notebooks provide explanations and implementation code; a sketch of the fine-tuning workflow follows below.
No commits in the last 6 months.
Stars: 6
Forks: 1
Language: Jupyter Notebook
License: Apache-2.0
Category: transformers
Last pushed: Mar 13, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/aaaastark/Pretrain_Finetune_Transformers_Pytorch"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
explosion/curated-transformers
🤖 A PyTorch library of curated Transformer models and their composable components
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
ruanchaves/hashformers
Accurate word segmentation for hashtags and text, powered by Transformers and Beam Search. A...