Knowledge Distillation Frameworks (Prompt Engineering Tools)
Seven knowledge distillation framework tools are tracked. The highest-rated is thunlp/PromptPapers, scoring 38/100 with 4,296 stars.
Get all 7 projects as JSON:
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=prompt-engineering&subcategory=knowledge-distillation-frameworks&limit=20"
The API is open to everyone: 100 requests/day with no key required, or 1,000 requests/day with a free API key.
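If you want to consume the endpoint from Python rather than curl, a minimal sketch is below. The response field names (`projects`, `name`, `score`) are assumptions about the JSON shape, not documented behavior, so a canned sample response is parsed here in place of a live request:

```python
import json

# Hypothetical response shape for the /datasets/quality endpoint.
# The field names here are an assumption, not documented API behavior;
# only the thunlp/PromptPapers figures (38/100, 4,296 stars) come from
# the listing above.
sample_response = """
{
  "projects": [
    {"name": "thunlp/PromptPapers", "score": 38, "stars": 4296, "tier": "Emerging"}
  ]
}
"""

def top_project(raw: str) -> dict:
    """Return the highest-scoring project from a quality-API response."""
    projects = json.loads(raw)["projects"]
    return max(projects, key=lambda p: p["score"])

best = top_project(sample_response)
print(f"{best['name']}: {best['score']}/100")
```

For a live request, swap `sample_response` for the body returned by the curl command above (e.g. via `urllib.request` or `requests`), keeping the daily rate limits in mind.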
| # | Tool | Score | Tier |
|---|---|---|---|
| 1 | thunlp/PromptPapers: Must-read papers on prompt-based tuning for pre-trained language models. | 38 | Emerging |
| 2 | google-research/prompt-tuning: Original implementation of Prompt Tuning from Lester et al., 2021. | | Emerging |
| 3 | ZhangYuanhan-AI/NOAH: [TPAMI] Searching prompt modules for parameter-efficient transfer learning. | | Emerging |
| 4 | gmkim-ai/PromptKD: An official implementation of "PromptKD: Distilling Student-Friendly... | | Experimental |
| 5 | zhengzangw/DoPrompt: Official implementation of PCS in essay "Prompt Vision Transformer for... | | Experimental |
| 6 | Hzfinfdu/MPMP: ACL'2023: Multi-Task Pre-Training of Modular Prompt for Few-Shot Learning. | | Experimental |
| 7 | youngjae-cho/APP: Official PyTorch implementation for Make Prompts Adaptable: Bayesian... | | Experimental |