BERT Model Deployment NLP Tools
Tools for deploying, serving, and running BERT models in production environments. Includes pre-training frameworks, model optimization, and inference APIs. Does NOT include fine-tuning BERT for specific downstream tasks (covered by task-specific categories) or general text similarity/matching applications.
109 BERT model deployment tools are tracked; 8 score above 50 (the established tier). The highest-rated is codertimo/BERT-pytorch at 65/100, with 6,517 stars and 238 monthly downloads.
Get the 109 projects as JSON (the `limit` parameter controls how many are returned per request):

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=nlp&subcategory=bert-model-deployment&limit=20"
```

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.
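To collect all 109 records rather than a single page, a small client can page through the endpoint. This is a minimal sketch, assuming the API accepts an `offset` parameter, returns a JSON list of project records, and (for keyed access) takes a `Bearer` token in the `Authorization` header — none of these details are confirmed above, so adjust the parsing and auth to match the actual responses.

```python
import json
import urllib.parse
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain, subcategory, limit=20, offset=0):
    """Build a dataset query URL; `offset` is an assumed parameter."""
    params = {
        "domain": domain,
        "subcategory": subcategory,
        "limit": limit,
        "offset": offset,
    }
    return BASE + "?" + urllib.parse.urlencode(params)

def fetch_all(domain, subcategory, limit=20, api_key=None):
    """Page through results until a short page signals the end.

    Assumes each response body is a JSON array of records; if the API
    wraps results in an envelope (e.g. {"results": [...]}), unwrap it here.
    """
    items, offset = [], 0
    while True:
        req = urllib.request.Request(build_url(domain, subcategory, limit, offset))
        if api_key:  # header name is an assumption, not documented above
            req.add_header("Authorization", f"Bearer {api_key}")
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)
        items.extend(page)
        if len(page) < limit:  # short page: no more data to fetch
            return items
        offset += limit
```

With the rate limits above, the whole subcategory fits comfortably in a handful of requests at `limit=20`.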
| # | Tool | Score | Tier |
|---|---|---|---|
| 1 | codertimo/BERT-pytorch: Google AI 2018 BERT PyTorch implementation | 65 | Established |
| 2 | sileod/tasknet: Easy ModernBERT fine-tuning and multi-task learning | | Established |
| 3 | 920232796/bert_seq2seq: PyTorch implementation of BERT... | | Established |
| 4 | JayYip/m3tl: BERT for multitask learning | | Established |
| 5 | google-research/bert: TensorFlow code and pre-trained models for BERT | | Established |
| 6 | graykode/toeicbert: TOEIC (Test of English for International Communication) solving using... | | Established |
| 7 | gaphex/bert_experimental: Code and supplementary materials for a series of Medium articles about the BERT model | | Established |
| 8 | ymcui/Chinese-BERT-wwm: Pre-training with whole-word masking for Chinese BERT | | Established |
| 9 | guotong1988/BERT-pre-training: Multi-GPU pre-training on one machine for BERT without Horovod (data parallelism) | | Emerging |
| 10 | yongzhuo/Macadam: NLP toolkit built on TensorFlow (Keras) and bert4keras, focused on text classification, sequence labeling, and relation extraction; supports RAND... | | Emerging |
| 11 | MuQiuJun-AI/bert4pytorch: Ultra-lightweight PyTorch BERT with extensive Chinese comments; easy to restructure, continuously updated | | Emerging |
| 12 | ymcui/Chinese-XLNet: Pre-trained Chinese XLNet | | Emerging |
| 13 | Separius/BERT-keras: Keras implementation of BERT with pre-trained weights | | Emerging |
| 14 | lonePatient/albert_pytorch: A Lite BERT for self-supervised learning of language representations | | Emerging |
| 15 | microsoft/AzureML-BERT: End-to-end recipes for pre-training and fine-tuning BERT using Azure Machine... | | Emerging |
| 16 | ymcui/Chinese-ELECTRA: Pre-trained Chinese ELECTRA | | Emerging |
| 17 | MartinoMensio/spacy-sentence-bert: Sentence-transformers models for spaCy | | Emerging |
| 18 | instadeepai/tunbert: TunBERT is the first release of a pre-trained BERT model for the Tunisian... | | Emerging |
| 19 | huggingface/hmtl: 🌊 HMTL: Hierarchical Multi-Task Learning, a state-of-the-art neural network... | | Emerging |
| 20 | ShannonAI/glyce: Code for NeurIPS 2019 Glyce: Glyph-vectors for Chinese Character Representations | | Emerging |
| 21 | BI4O/rasa_milktea_chatbot: Chinese chatbot combining BERT intent analysis, built on the Rasa framework | | Emerging |
| 22 | jasonwu0731/ToD-BERT: Pre-trained models for ToD-BERT | | Emerging |
| 23 | practicingman/bert_serving: Export a BERT model for serving | | Emerging |
| 24 | imgarylai/bert-embedding: 🔡 Token-level embeddings from a BERT model on MXNet and GluonNLP | | Emerging |
| 25 | xiongma/chinese-law-bert-similarity: Chinese BERT text similarity | | Emerging |
| 26 | zliucr/coach: Coach: A Coarse-to-Fine Approach for Cross-Domain Slot Filling (ACL 2020) | | Emerging |
| 27 | Lipairui/textgo: Text preprocessing, representation, similarity calculation, text search and... | | Emerging |
| 28 | alisafaya/Arabic-BERT: Arabic edition of BERT pre-trained language models | | Emerging |
| 29 | autoliuweijie/K-BERT: Source code of K-BERT (AAAI 2020) | | Emerging |
| 30 | mlwithme/BertWithPretrained: An implementation of the BERT model and its related downstream tasks based... | | Emerging |
| 31 | writer/fitbert: Use BERT to fill in the blanks | | Emerging |
| 32 | prakhar21/Writing-with-BERT: Using BERT for conditional natural language generation by... | | Emerging |
| 33 | sudharsan13296/Getting-Started-with-Google-BERT: Build and train state-of-the-art natural language processing models using BERT | | Emerging |
| 34 | zhusleep/pytorch_chinese_lm_pretrain: Chinese language model pre-training in PyTorch | | Emerging |
| 35 | lonePatient/MobileBert_PyTorch: MobileBERT: a compact task-agnostic BERT for resource-limited devices | | Emerging |
| 36 | jonathanbratt/RBERT: Implementation of BERT in R | | Emerging |
| 37 | ShuHuang/batterybert: BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement | | Emerging |
| 38 | ymcui/LERT: LERT: A Linguistically-Motivated Pre-trained Language Model | | Emerging |
| 39 | rokid/ELMo-chinese: Deep contextualized word representations for Chinese | | Emerging |
| 40 | jhgan00/ko-sentence-transformers: Sentence embeddings built on pre-trained Korean models | | Emerging |
| 41 | Brokenwind/BertSimilarity: Computing similarity of two sentences with Google's BERT... | | Emerging |
| 42 | soskek/bert-chainer: Chainer implementation of "BERT: Pre-training of Deep Bidirectional... | | Emerging |
| 43 | lonePatient/electra_pytorch: ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators | | Emerging |
| 44 | lgessler/microbert: A tiny BERT for low-resource monolingual models | | Emerging |
| 45 | indiejoseph/chinese-char-rnn: Character-level language models | | Experimental |
| 46 | tunib-ai/tunib-electra: Korean-English bilingual ELECTRA models | | Experimental |
| 47 | taishan1994/pytorch-distributed-NLP: PyTorch distributed training | | Experimental |
| 48 | jeffchy/RE2RNN: Source code for the EMNLP 2020 paper "Cold-Start and Interpretability:... | | Experimental |
| 49 | nishiwen1214/GLUE-bert4keras: GLUE benchmark code based on bert4keras | | Experimental |
| 50 | yulleyi/bert-kanji-graph: Using LLMs and graph algorithms to understand the semantics of Japanese kanji | | Experimental |
| 51 | richarddwang/electra_pytorch: Pretrain and fine-tune ELECTRA with fastai and huggingface. (Results of the... | | Experimental |
| 52 | JosselinSomervilleRoberts/BERT-Multitask-learning: Multitask learning on a BERT backbone; allows easy training of a BERT model... | | Experimental |
| 53 | HuBoren99/SmartBert: The implementation of SmartBERT: A Promotion of Dynamic Early Exiting... | | Experimental |
| 54 | kryvokhyzha/how-to-understand-bert: A collection of different sources, visualizations, and... | | Experimental |
| 55 | tugstugi/mongolian-bert: Pre-trained Mongolian BERT models | | Experimental |
| 56 | CLUEbenchmark/LightLM: High-performance small-model evaluation; Shared Tasks in NLPCC 2020, Task 1: Light Pre-Training Chinese... | | Experimental |
| 57 | muhwagua/color-bert: ColorBERT: Colorizing BERT's perception of the world for visual downstream tasks | | Experimental |
| 58 | shibing624/text2vec-service: Service for turning text into vectors with a BERT model... | | Experimental |
| 59 | kuzgnlar/models: Pre-trained ELECTRA models for Q&A, NER, and sentiment analysis tasks | | Experimental |
| 60 | MalteHB/-l-ctra: Ælæctra, created as part of a Cognitive Science bachelor thesis, in the... | | Experimental |
| 61 | StefenSal/Bert-Tokens-Tools: Useful tools for handling multilingual tokens when using BERT | | Experimental |
| 62 | snunlp/KR-BERT: Korean-based BERT pre-trained models (KR-BERT) for TensorFlow and PyTorch | | Experimental |
| 63 | pysentimiento/robertuito: A pre-trained language model for social media text in Spanish | | Experimental |
| 64 | dh1105/Sentence-Entailment: Benchmarking deep learning models such as BERT, ALBERT, and BiLSTMs on... | | Experimental |
| 65 | nguyenvulebinh/vietnamese-electra: ELECTRA model pre-trained on a Vietnamese corpus | | Experimental |
| 66 | cbenge509/BERTVision: A parameter-efficient compression model architecture for a variety of NLP... | | Experimental |
| 67 | IceFlameWorm/TextPair: Text-pair relation comparison: semantic similarity, literal similarity, textual entailment, and more | | Experimental |
| 68 | nishiwen1214/SuperGLUE-bert4keras: SuperGLUE benchmark code based on bert4keras | | Experimental |
| 69 | vmware-archive/bert-pretraining: A Python module that facilitates BERT pretraining. The... | | Experimental |
| 70 | kipi-ai/korpatbert: KorPatBERT, a Korean AI language model specialized for the patent domain | | Experimental |
| 71 | LetianLee/BERT-Jittor: A BERT model built with Jittor, with a Jittor-based NLP tutorial | | Experimental |
| 72 | PengboLiu/Slot-Filling: Spoken Language Understanding (SLU) / slot filling in PyTorch | | Experimental |
| 73 | SunYanCN/BAND: BAND: BERT Application aNd Deployment, a simple and efficient BERT model... | | Experimental |
| 74 | wisdomify/wisdomify: A BERT-based reverse dictionary of Korean proverbs | | Experimental |
| 75 | lonePatient/bert-sentence-similarity-pytorch: A PyTorch implementation of a pretrained BERT model for... | | Experimental |
| 76 | felicitywang/TFMTL: A TensorFlow framework for multi-task learning and text classification | | Experimental |
| 77 | mhardalov/bg-reason-BERT: Beyond English-Only Reading Comprehension: Experiments in Zero-Shot... | | Experimental |
| 78 | sunyilgdx/RoBERTa4Keras: An English RoBERTa based on bert4keras | | Experimental |
| 79 | ImperialNLP/BertGen: Training and evaluation code for the BertGen paper (ACL-IJCNLP 2021) | | Experimental |
| 80 | lanwuwei/GigaBERT: Zero-shot transfer learning from English to Arabic | | Experimental |
| 81 | sayakpaul/BERT-for-Mobile: Compares the DistilBERT and MobileBERT architectures for mobile deployments | | Experimental |
| 82 | lancopku/CascadeBERT: Code for CascadeBERT (Findings of EMNLP 2021) | | Experimental |
| 83 | himkt/awesome-bert-japanese: 📝 A list of pre-trained BERT models for Japanese with word/subword... | | Experimental |
| 84 | kipi-ai/korpatelectra: KorPatELECTRA, a Korean AI language model specialized for the patent domain | | Experimental |
| 85 | OctopusMind/longBert: Long-text similarity model | | Experimental |
| 86 | Akruzen/Briefer: An Android application based on TensorFlow's BERT to perform NLP operations... | | Experimental |
| 87 | ZhenwenZhang/Slot_Filling: Latest research advances on semantic slot filling | | Experimental |
| 88 | bioinf-mcb/BERT-torchserve-quickstart: Serving BERT embeddings via TorchServe | | Experimental |
| 89 | LegalInsight/PretrainedModel: Korean legal-domain pre-trained model (BERT) | | Experimental |
| 90 | felixstuart/Megablunder-Prediction-Model: A BERT model fine-tuned to classify 8 grammatical errors | | Experimental |
| 91 | leeway0507/Bert_For_Domain_Adaptation: Explains domain adaptation for BERT using Hugging Face | | Experimental |
| 92 | GaoQ1/bert4pl: Use BERT with Transformers and PyTorch Lightning | | Experimental |
| 93 | sunyilgdx/Prompts4Keras: Prompt-learning methods using bert4keras (PET, EFL, and NSP-BERT), both for... | | Experimental |
| 94 | syeda434am/Domain-Adaptive-BERT-Pretraining: Code for pretraining a BERT model on... | | Experimental |
| 95 | NivAm12/Enhancing-By-Subtasks-Components: Tackles data scarcity in a specific task by training a... | | Experimental |
| 96 | aroagomez/STS_NLP_Transformers: Natural Language Processing (NLP) project to evaluate the... | | Experimental |
| 97 | shiva-s936/ELMo-Deep-Contextualized-Word-Representations: An implementation of ELMo embeddings using PyTorch, featuring stacked... | | Experimental |
| 98 | devjwsong/dialogue-sentence-bert-pytorch: DialogueSentenceBERT: SentenceBERT for More Representative Utterance... | | Experimental |
| 99 | Sami9166/ELMo_Experiment: ELMo performance experiment using PyTorch | | Experimental |
| 100 | MasoudKargar/RBMD: RBMD: RoBERTa-Based Module Detection in Multi-Programming-Language Software Systems | | Experimental |
| 101 | ervin-kiose/citation-link-prediction: Academic-paper link prediction using TF-IDF, Sentence-BERT, Doc2Vec, and LDA... | | Experimental |
| 102 | W6WM9M/MA-BERT: Pretrained checkpoint for MA-BERT | | Experimental |
| 103 | ayperiKhudaybergenova/bert-distilbert-comparison-WNLI-NER: ⚙️ Comparison of Transformer-based language models | | Experimental |
| 104 | rplacucci/BERT: Full PyTorch implementation of BERT: Pre-training of Deep Bidirectional... | | Experimental |
| 105 | nasrin-taghizadeh/SinaBERT: Data and code for the SinaBERT language model | | Experimental |
| 106 | mmdjiji/bert-chinese-idioms: A Chinese idiom recommendation system based on the BERT pre-trained language model | | Experimental |
| 107 | ramiyappan/BERT: BERT implemented from scratch in PyTorch using Stanford Sentiment... | | Experimental |
| 108 | andreped/NLP-MTL: Training neural networks to solve multiple tasks simultaneously from free... | | Experimental |
| 109 | Manas02/FragmentBERT: Masked language models are fragment-based drug designers | | Experimental |