bert and BERT-keras
                 bert           BERT-keras
  Maintenance    0/25           0/25
  Adoption       10/25          10/25
  Maturity       16/25          9/25
  Community      25/25          25/25
  Stars          39,918         815
  Forks          9,711          191
  Downloads      —              —
  Commits (30d)  0              0
  Language       Python         Python
  License        Apache-2.0     GPL-3.0

Both repositories carry the same status flags: Archived, Stale 6m, No Package, No Dependents.
About bert
google-research/bert
TensorFlow code and pre-trained models for BERT
Provides 24 compact model variants (2-12 layers, 128-768 hidden units) optimized for resource-constrained environments, validated on GLUE benchmarks, with support for knowledge distillation. Introduces Whole Word Masking during pre-training, which masks all WordPiece tokens of a word at once to increase the difficulty of the masked-token prediction task, and offers TensorFlow Hub integration for streamlined fine-tuning on downstream NLP tasks such as text classification and semantic similarity.
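A minimal sketch of the whole-word-masking idea (the helper name `whole_word_mask` is hypothetical, not the repository's actual pre-training pipeline): WordPiece tokens beginning with "##" are grouped with the word they continue, and each group is masked or kept as a unit. The sample tokens mirror the "phil ##am ##mon" example in the repository's README.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]"):
    """Mask whole words in a WordPiece token sequence (illustrative sketch)."""
    # Group token indices into whole words: a token starting with "##"
    # continues the previous word, so it joins that word's group.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for group in words:
        if random.random() < mask_prob:
            for i in group:  # every piece of the word is masked together
                masked[i] = mask_token
    return masked

# "philammon" is split into three WordPiece tokens; with whole-word masking
# they are always masked as one unit, never individually.
print(whole_word_mask(["put", "his", "basket", "on", "phil", "##am", "##mon",
                       "'", "s", "head"], mask_prob=0.3))
```

With per-token masking the model can often recover a hidden "##mon" from the visible "phil ##am"; masking the whole word forces it to rely on the wider sentence context instead.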
About BERT-keras
Separius/BERT-keras
Keras implementation of BERT with pre-trained weights
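The project's own API is not reproduced here; as a generic illustration of the kind of building block a Keras implementation of BERT provides, the sketch below defines a minimal BERT-style Transformer encoder layer from standard Keras components (all names and hyperparameters are illustrative assumptions, not the project's code).

```python
import tensorflow as tf

class EncoderBlock(tf.keras.layers.Layer):
    """Minimal BERT-style Transformer encoder block (illustrative only)."""
    def __init__(self, hidden=768, heads=12, ff=3072, drop=0.1, **kwargs):
        super().__init__(**kwargs)
        self.attn = tf.keras.layers.MultiHeadAttention(
            num_heads=heads, key_dim=hidden // heads)
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(ff, activation="gelu"),
            tf.keras.layers.Dense(hidden),
        ])
        self.norm1 = tf.keras.layers.LayerNormalization(epsilon=1e-12)
        self.norm2 = tf.keras.layers.LayerNormalization(epsilon=1e-12)
        self.drop = tf.keras.layers.Dropout(drop)

    def call(self, x, training=False):
        # Self-attention sublayer with residual connection and LayerNorm.
        a = self.attn(x, x)
        x = self.norm1(x + self.drop(a, training=training))
        # Position-wise feed-forward sublayer, same residual pattern.
        f = self.ffn(x)
        return self.norm2(x + self.drop(f, training=training))

# One encoder block applied to a batch of 2 sequences of length 8.
y = EncoderBlock()(tf.random.normal([2, 8, 768]))
print(y.shape)  # (2, 8, 768)
```

Stacking a dozen such blocks over token, segment, and position embeddings, then loading the published pre-trained weights, is essentially what a Keras port of BERT amounts to.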
Scores updated daily from GitHub, PyPI, and npm data.