thunlp/PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
Curated collection of 65+ research papers on prompt-based tuning, organized by category (overview, basics, analysis, improvements, specializations) and tagged with methodological keywords such as template type and task focus. It complements the OpenPrompt toolkit, covering both discrete and continuous prompting approaches across classification, generation, and analysis tasks. The list is actively maintained as a community resource, with pull request contributions encouraged to track emerging prompt-learning research directions.
4,296 stars. No commits in the last 6 months.
Stars: 4,296
Forks: 388
Language: —
License: —
Category: prompt-engineering
Last pushed: Jul 17, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/thunlp/PromptPapers"
Open to everyone: 100 requests/day with no key required. Get a free key to raise the limit to 1,000/day.
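The same request can be made from Python with the standard library. This is a minimal sketch: it assumes the endpoint returns JSON, and the helper names (`quality_url`, `fetch_quality`) are our own, not part of the service.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch repository quality data (100 requests/day without a key).

    Assumes the response body is JSON; the schema is not documented here.
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("prompt-engineering", "thunlp", "PromptPapers")` issues the same GET request as the curl command above and returns the parsed response.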
Higher-rated alternatives
google-research/prompt-tuning
Original implementation of Prompt Tuning from Lester et al., 2021
ZhangYuanhan-AI/NOAH
[TPAMI] Searching prompt modules for parameter-efficient transfer learning.
zhengzangw/DoPrompt
Official implementation of PCS from the paper "Prompt Vision Transformer for Domain Generalization"
Hzfinfdu/MPMP
ACL'2023: Multi-Task Pre-Training of Modular Prompt for Few-Shot Learning
gmkim-ai/PromptKD
An official implementation of "PromptKD: Distilling Student-Friendly Knowledge for Generative...