simpletransformers and transformers-tutorials

Metric          simpletransformers   transformers-tutorials
--------------  -------------------  ------------------------------------
Score           75 (Verified)        -
Maintenance     2/25                 0/25
Adoption        24/25                10/25
Maturity        25/25                9/25
Community       24/25                25/25
Stars           4,234                859
Forks           721                  196
Downloads       52,813               -
Commits (30d)   0                    0
Language        Python               Jupyter Notebook
License         Apache-2.0           MIT
Status          Stale 6m             Stale 6m, no package, no dependents

About simpletransformers

ThilinaRajapakse/simpletransformers

Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI

Wraps HuggingFace Transformers with task-specific model classes that standardize the train/eval/predict workflow across NLP and multi-modal applications. Built-in integrations with Weights & Biases enable experiment tracking, while support for any HuggingFace pretrained model (BERT, RoBERTa, T5, etc.) provides flexibility without lock-in. Dense retrieval, conversational AI, and encoder fine-tuning extend beyond typical classification pipelines.
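The standardized train/eval/predict workflow can be sketched as follows. This is a minimal, dependency-free stand-in, not the library's implementation: the class name `TaskModel` and its internal logic are hypothetical, though the `train_model` / `eval_model` / `predict` method names mirror the interface simpletransformers' task classes (e.g. `ClassificationModel`) document.

```python
# Hypothetical stand-in illustrating the train/eval/predict shape that
# simpletransformers' task-specific classes share. The real library
# fine-tunes a HuggingFace checkpoint; here a trivial majority-class
# "model" stands in so the sketch runs without any dependencies.

class TaskModel:
    """Stand-in for a task wrapper such as ClassificationModel."""

    def __init__(self, model_type, model_name):
        self.model_type = model_type    # e.g. "roberta"
        self.model_name = model_name    # any HF checkpoint, e.g. "roberta-base"
        self.label_counts = {}

    def train_model(self, train_data):
        # Real library: fine-tunes the underlying model on training data.
        # Sketch: just count labels to pick a majority class later.
        for _text, label in train_data:
            self.label_counts[label] = self.label_counts.get(label, 0) + 1

    def eval_model(self, eval_data):
        # Real library: returns metrics computed on held-out data.
        # Sketch: accuracy of the majority-class baseline.
        majority = max(self.label_counts, key=self.label_counts.get)
        correct = sum(1 for _text, label in eval_data if label == majority)
        return {"acc": correct / len(eval_data)}

    def predict(self, texts):
        # Real library: returns (predictions, raw model outputs).
        majority = max(self.label_counts, key=self.label_counts.get)
        return [majority for _ in texts], None


model = TaskModel("roberta", "roberta-base")
model.train_model([("good", 1), ("great", 1), ("bad", 0)])
result = model.eval_model([("fine", 1), ("awful", 0)])
preds, _ = model.predict(["new text"])
print(result["acc"], preds)  # majority-class baseline: 0.5 [1]
```

The point of the pattern is that swapping tasks (classification, NER, QA) or checkpoints (BERT, RoBERTa, T5) changes only the constructor arguments, never the three-method workflow.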

About transformers-tutorials

abhimishra91/transformers-tutorials

GitHub repository with tutorials on fine-tuning transformers for different NLP tasks.

Scores are updated daily from GitHub, PyPI, and npm data.