illiterate/BertClassifier

A BERT-based Chinese text classification model implemented in PyTorch (基于PyTorch的BERT中文文本分类模型)

Quality score: 42 / 100 (Emerging)

Leverages Hugging Face's transformers library to extract sentence embeddings from pretrained BERT, feeding the [CLS] token representation through a linear classification layer and softmax for 10-category Chinese news classification. Achieves 0.92 accuracy on the THUCNews dataset (50k training samples across sports, entertainment, real estate, education, and other domains). Designed as an educational implementation with straightforward architecture suitable for NLP beginners to understand BERT fine-tuning fundamentals.
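The classification head described above reduces to a linear projection of the 768-dimensional [CLS] embedding into 10 logits, followed by a softmax. A minimal pure-Python sketch of that math (the repository itself uses PyTorch and Hugging Face transformers; the shapes, class count, and random parameters here are illustrative assumptions, not the repo's code):

```python
import math
import random

HIDDEN_SIZE = 768   # BERT-base hidden size (assumed)
NUM_CLASSES = 10    # 10 THUCNews categories, per the description

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(cls_embedding, weight, bias):
    """Linear layer + softmax over the [CLS] embedding.

    weight: NUM_CLASSES x HIDDEN_SIZE matrix, bias: NUM_CLASSES vector.
    Returns a probability distribution over the classes.
    """
    logits = [
        sum(w * x for w, x in zip(row, cls_embedding)) + b
        for row, b in zip(weight, bias)
    ]
    return softmax(logits)

# Demo with random (untrained) parameters.
random.seed(0)
cls_vec = [random.gauss(0, 1) for _ in range(HIDDEN_SIZE)]
W = [[random.gauss(0, 0.02) for _ in range(HIDDEN_SIZE)] for _ in range(NUM_CLASSES)]
b = [0.0] * NUM_CLASSES
probs = classify(cls_vec, W, b)
print(len(probs), sum(probs))
```

Fine-tuning trains both the weight matrix of this head and the underlying BERT encoder; at inference the predicted category is simply the argmax of the probability vector.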

203 stars. No commits in the last 6 months.

Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 203
Forks: 25
Language: Python
License: MIT
Last pushed: Mar 17, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/illiterate/BertClassifier"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.