google-research/bert
TensorFlow code and pre-trained models for BERT
Archived. Provides 24 compact BERT variants (2-12 layers, 128-768 hidden units) for resource-constrained environments, validated on the GLUE benchmarks with support for knowledge distillation. Also introduces Whole Word Masking during pre-training, which masks all WordPiece tokens of a word together to make the prediction task harder, and offers TensorFlow Hub integration for streamlined fine-tuning on downstream NLP tasks such as text classification and semantic similarity.
39,918 stars. No commits in the last 6 months.
Stars: 39,918
Forks: 9,711
Language: Python
License: Apache-2.0
Category:
Last pushed: Jul 23, 2024
Commits (30d): 0
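The Whole Word Masking scheme mentioned in the description can be sketched roughly: under WordPiece tokenization, pieces prefixed with `##` continue the previous word, so masking selects whole word groups rather than individual pieces. A minimal illustration (the repository's actual logic lives in `create_pretraining_data.py`; the function below is a simplified sketch, not the repo's implementation):

```python
import random

def whole_word_mask(tokens, num_to_mask, seed=0):
    """Mask whole words in a WordPiece token sequence.

    Pieces starting with '##' continue the previous word, so they are
    grouped with it; masking then applies to entire groups at once.
    """
    rng = random.Random(seed)
    # Build word groups: each group is a list of token indices.
    groups = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and groups:
            groups[-1].append(i)
        else:
            groups.append([i])
    rng.shuffle(groups)
    masked = list(tokens)
    masked_count = 0
    for group in groups:
        if masked_count >= num_to_mask:
            break
        for i in group:
            masked[i] = "[MASK]"
        masked_count += len(group)
    return masked
```

For example, in `["the", "play", "##ing", "field"]` the pieces `play` and `##ing` form one word, so either both are masked or neither is.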
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/google-research/bert"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
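Judging from the curl example above, the endpoint appears to follow the pattern `/api/v1/quality/<category>/<owner>/<repo>`. A small helper that builds such URLs (the path pattern is an inference from that one example, not a documented spec, and the response schema and key-passing mechanism are not described here, so this sketch stops at request construction):

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    """Build a quality-API URL; the category/owner/repo path layout is
    inferred from the curl example, not from published documentation."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

# quality_url("nlp", "google-research", "bert") reproduces the URL
# shown in the curl command above.
```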
Related tools
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
920232796/bert_seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also handles automatic summarization, text classification, sentiment analysis, NER, and part-of-speech tagging, with support for the T5 model and GPT-2 text continuation.
JayYip/m3tl
BERT for Multitask Learning
graykode/toeicbert
TOEIC (Test of English for International Communication) question solving using the pytorch-pretrained-BERT model.