ucinlp/autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
Uses gradient-guided search to automatically discover trigger token sequences that prompt masked language models (BERT, RoBERTa) to perform downstream tasks without task-specific fine-tuning. Supports sentiment analysis, natural language inference, fact retrieval, and relation extraction through customizable prompt templates with placeholder substitution. Integrates with the LAMA framework for knowledge probing evaluation and provides utilities for automatic label token selection via discrete optimization.
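The gradient-guided search described above can be sketched with a first-order (HotFlip-style) approximation: each vocabulary word is scored as a candidate replacement for a trigger token by taking the dot product of its embedding with the gradient of the loss at the current trigger embedding. The toy embeddings, sizes, and names below are illustrative assumptions, not the repo's actual API.

```python
import numpy as np

# Toy sketch of AutoPrompt-style candidate scoring (assumed setup, not the
# repo's real code): score(w) ~ e_w . grad, where grad is the gradient of
# the task loss with respect to the current trigger token's embedding.
rng = np.random.default_rng(0)
vocab_size, dim = 1000, 64
embedding = rng.normal(size=(vocab_size, dim))  # toy word-embedding matrix
grad = rng.normal(size=dim)                     # toy loss gradient at trigger

# Higher score ~ larger first-order decrease in loss if the trigger token
# were swapped for that word; keep the top-k as candidate replacements.
scores = embedding @ grad
top_k = np.argsort(-scores)[:10]
```

In the actual method, each of the top-k candidates is then evaluated exactly on a batch and the best-performing replacement is kept, iterating over trigger positions.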
641 stars. No commits in the last 6 months.
Stars: 641
Forks: 86
Language: Python
License: Apache-2.0
Category:
Last pushed: Aug 24, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/ucinlp/autoprompt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
THUDM/P-tuning-v2
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
zjunlp/KnowPrompt
[WWW 2022] KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation...
zjunlp/PromptKG
PromptKG Family: a Gallery of Prompt Learning & KG-related research works, toolkits, and paper-list.
VE-FORBRYDERNE/mtj-softtuner
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab...
princeton-nlp/OptiPrompt
[NAACL 2021] Factual Probing Is [MASK]: Learning vs. Learning to Recall https://arxiv.org/abs/2104.05240