prakhar21/Writing-with-BERT
Conditional natural language generation with BERT, fine-tuned from a pre-trained checkpoint on a custom dataset.
No commits in the last 6 months.
Stars: 41
Forks: 12
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: Feb 18, 2020
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/prakhar21/Writing-with-BERT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
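The same endpoint can be queried from Python. A minimal sketch using only the standard library — the URL pattern comes from the curl example above, but the shape of the JSON response is not documented here, so the parsing step is left generic:

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def quality_api_url(owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL, matching the curl example."""
    return f"{BASE}/{owner}/{repo}"

url = quality_api_url("prakhar21", "Writing-with-BERT")
print(url)

# Uncomment to fetch live data (anonymous tier: 100 requests/day).
# The response fields are an assumption; inspect the JSON before relying on keys.
# with urlopen(url) as resp:
#     data = json.load(resp)
#     print(json.dumps(data, indent=2))
```

With an API key (1,000 requests/day), the key would typically be passed as a header or query parameter; check the service's docs for the exact mechanism, as it is not specified here.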
Higher-rated alternatives
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
920232796/bert_seq2seq
PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also handles automatic summarization, text classification, sentiment analysis, NER, and POS tagging; supports the T5 model and GPT-2 for article continuation.
JayYip/m3tl
BERT for Multitask Learning
graykode/toeicbert
TOEIC(Test of English for International Communication) solving using pytorch-pretrained-BERT model.