Prompt Engineering Tools: Knowledge Distillation Frameworks

Seven knowledge distillation framework tools are tracked in this category. The highest-rated is thunlp/PromptPapers, scoring 38/100 with 4,296 GitHub stars.

To get all 7 projects as JSON:

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=prompt-engineering&subcategory=knowledge-distillation-frameworks&limit=20"

The API is open to everyone at 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.

| # | Tool | Description | Score | Tier |
|---|------|-------------|-------|------|
| 1 | thunlp/PromptPapers | Must-read papers on prompt-based tuning for pre-trained language models. | 38 | Emerging |
| 2 | google-research/prompt-tuning | Original implementation of Prompt Tuning from Lester et al., 2021. | 36 | Emerging |
| 3 | ZhangYuanhan-AI/NOAH | [TPAMI] Searching prompt modules for parameter-efficient transfer learning. | 30 | Emerging |
| 4 | gmkim-ai/PromptKD | An official implementation of "PromptKD: Distilling Student-Friendly... | 28 | Experimental |
| 5 | zhengzangw/DoPrompt | Official implementation of PCS in essay "Prompt Vision Transformer for... | 26 | Experimental |
| 6 | Hzfinfdu/MPMP | ACL 2023: Multi-Task Pre-Training of Modular Prompt for Few-Shot Learning. | 23 | Experimental |
| 7 | youngjae-cho/APP | Official PyTorch implementation for Make Prompts Adaptable: Bayesian... | 20 | Experimental |