datawhalechina/learn-nlp-with-transformers
We want to create a repo to illustrate the usage of Transformers in Chinese.
Structured around foundational theory and applied practice, the curriculum progresses from attention mechanisms and Transformer architectures (with PyTorch implementations) through BERT and GPT variants, then to downstream tasks including text classification, sequence tagging, extractive QA, and generation tasks such as machine translation and summarization. It leverages the Hugging Face Transformers library as the primary implementation framework, with all content and examples delivered in Chinese to lower the barrier for Mandarin-speaking NLP practitioners.
3,143 stars. No commits in the last 6 months.
Stars: 3,143
Forks: 499
Language: Shell
License: —
Category:
Last pushed: Aug 18, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/datawhalechina/learn-nlp-with-transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
xv44586/toolkit4nlp
Transformers implementations (architecture, task examples, serving, and more).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper.