NivAm12/Enhancing-By-Subtasks-Components
This project tackles data scarcity in a target task by training a single base model with multiple heads, each dedicated to a different NLP task. These "supporting tasks" let the model leverage knowledge shared across domains, improving its performance and robustness on the data-scarce target task.
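The shared-encoder, multi-head setup described above can be sketched roughly as follows. Everything here is illustrative (toy numpy layers, made-up dimensions, hypothetical task names), not the project's actual code: one encoder computes a shared representation once, and each task head reads from it.

```python
import numpy as np

rng = np.random.default_rng(0)

class SharedEncoder:
    """Single base encoder shared by all task heads (toy linear + tanh)."""
    def __init__(self, in_dim, hidden_dim):
        self.W = rng.normal(scale=0.1, size=(in_dim, hidden_dim))

    def __call__(self, x):
        return np.tanh(x @ self.W)  # shared representation for every task

class TaskHead:
    """Lightweight per-task classification head (linear + softmax)."""
    def __init__(self, hidden_dim, n_classes):
        self.W = rng.normal(scale=0.1, size=(hidden_dim, n_classes))

    def __call__(self, h):
        logits = h @ self.W
        e = np.exp(logits - logits.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

encoder = SharedEncoder(in_dim=16, hidden_dim=8)
heads = {
    "sentiment": TaskHead(8, 2),  # supporting task (hypothetical)
    "ner": TaskHead(8, 5),        # supporting task (hypothetical)
    "main": TaskHead(8, 3),       # data-scarce target task
}

x = rng.normal(size=(4, 16))      # toy batch of 4 "sentence embeddings"
h = encoder(x)                    # encoded once, reused by every head
outputs = {name: head(h) for name, head in heads.items()}
print({name: p.shape for name, p in outputs.items()})
```

During training, losses from all heads would be summed and backpropagated through the shared encoder, which is how the supporting tasks regularize the target task.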
No commits in the last 6 months.
Stars: 1
Forks: 1
Language: Python
License: —
Category: —
Last pushed: Jul 31, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/NivAm12/Enhancing-By-Subtasks-Components"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
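For script use, the curl call above can be wrapped in a small Python helper. Note the JSON field names below are assumed from the card shown above, not a documented schema, and the sample response is constructed locally for illustration rather than fetched from the API:

```python
import json
from urllib.parse import quote

# Endpoint taken from the curl example above
BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality URL."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

url = quality_url("NivAm12", "Enhancing-By-Subtasks-Components")

# Hypothetical response body mirroring the stats card above;
# the real API's schema may differ.
sample_response = json.dumps({
    "repo": "NivAm12/Enhancing-By-Subtasks-Components",
    "stars": 1,
    "forks": 1,
    "language": "Python",
    "last_pushed": "2023-07-31",
    "commits_30d": 0,
})

data = json.loads(sample_response)
stale = data["commits_30d"] == 0  # no commits in the last 30 days
print(url)
print(f"{data['repo']}: stars={data['stars']}, stale={stale}")
```

In a real script, `sample_response` would be replaced by the body of an HTTP GET against `url`.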
Higher-rated alternatives
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
920232796/bert_seq2seq
PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; also handles automatic summarization, text classification, sentiment analysis, NER, and POS tagging; supports the T5 model and article continuation with GPT-2.
JayYip/m3tl
BERT for Multitask Learning
graykode/toeicbert
TOEIC (Test of English for International Communication) question solving using the pytorch-pretrained-BERT model.