datawhalechina/learn-nlp-with-transformers

We want to create a repo to illustrate the usage of transformers, in Chinese.

Quality score: 41 / 100 (Emerging)

Structured around foundational theory and applied practice, the curriculum progresses from attention mechanisms and transformer architectures (with PyTorch implementations) through BERT and GPT variants, then to downstream tasks including text classification, sequence tagging, extractive QA, and generation tasks such as machine translation and summarization. It leverages the Hugging Face Transformers library as the primary implementation framework, with all content and examples delivered in Chinese to lower the barrier for Mandarin-speaking NLP practitioners.

3,143 stars. No commits in the last 6 months.

No License · Stale (6 months) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 23 / 25

Stars: 3,143
Forks: 499
Language: Shell
License: none
Last pushed: Aug 18, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/datawhalechina/learn-nlp-with-transformers"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
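The endpoint above follows a `/{category}/{owner}/{repo}` path shape. As a minimal sketch, the same request can be built programmatically; only the base URL and path come from the curl example above, and the response schema is not documented here, so the helper name and the fetch step are assumptions for illustration.

```python
# Sketch: build the per-repo quality endpoint URL (base URL taken from the
# curl example above; the response format is not specified on this page).
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Construct the endpoint path, e.g. .../nlp/owner/repo. (Hypothetical helper.)"""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("nlp", "datawhalechina", "learn-nlp-with-transformers")
print(url)
# To actually fetch (requires network access):
#   import urllib.request
#   data = urllib.request.urlopen(url).read()
```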