zyds/transformers-code

A hands-on, step-by-step practical Huggingface Transformers course; course videos are updated in sync on Bilibili and YouTube.

Overall score: 40 / 100 (Emerging)

Covers comprehensive Transformers training workflows across the foundational components (Pipeline, Tokenizer, Model, Datasets, Trainer) and nine NLP task implementations, including NER, machine reading comprehension, text similarity, dialogue systems, and summarization. Integrates the PEFT library for parameter-efficient fine-tuning methods (LoRA, QLoRA, Prompt-tuning, P-tuning), bitsandbytes for 4/8-bit quantized training, and Accelerate for distributed training with DeepSpeed support. Targets PyTorch 2.2+ with compatible versions of transformers 4.42+, datasets, and evaluation frameworks.
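To illustrate the style of parameter-efficient fine-tuning the course covers, below is a minimal PEFT LoRA sketch; the model name, target modules, and hyperparameters are illustrative assumptions, not values taken from the repository.

# Minimal LoRA fine-tuning setup with PEFT (illustrative values, not from the repo)
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

model_name = "hfl/chinese-macbert-base"  # assumption: any BERT-like HF model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Wrap the base model with LoRA adapters; only the adapter weights are trained.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,                                # rank of the low-rank update matrices
    lora_alpha=32,                      # scaling factor applied to the update
    lora_dropout=0.1,
    target_modules=["query", "value"],  # attention projections to adapt
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()      # prints the small trainable fraction

The wrapped model can then be passed to a standard Trainer, which is the pattern the course's PEFT chapters build on.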

3,853 stars. No commits in the last 6 months.

No License · Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 8 / 25
Community 22 / 25


Stars: 3,853
Forks: 504
Language: Jupyter Notebook
License: None
Last pushed: Jul 15, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/zyds/transformers-code"

The API is open to everyone at 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
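A minimal Python sketch of the same request, assuming the endpoint returns the quality data as JSON (the exact response fields are not documented here and are an assumption):

# Fetch the repository quality data shown above via the public API
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/zyds/transformers-code"
resp = requests.get(url, timeout=10)
resp.raise_for_status()          # fail loudly on HTTP errors or rate limiting
print(resp.json())               # inspect the returned score and metric breakdown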