automl/promptolution
A unified, modular framework for prompt optimization
Supports multiple state-of-the-art prompt optimization algorithms (CAPO, EvoPrompt, OPRO) with a unified LLM backend spanning API-based models, local inference via vLLM/transformers, and cluster deployments. Built-in response caching, parallelized inference, and detailed token tracking enable cost-efficient, reproducible large-scale experiments. Decomposes optimization into modular components—Task, Predictor, LLM, and Optimizer—allowing researchers to customize any stage without rigid abstractions.
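The Task / Predictor / LLM / Optimizer decomposition described above can be sketched as follows. This is an illustrative toy, not promptolution's actual API: all class and method names here (`EchoLLM`, `Task.score`, `Predictor.evaluate`, `RandomSearchOptimizer`) are hypothetical, and the "LLM" and metric are stand-ins so the example runs offline.

```python
import random

class EchoLLM:
    """Stand-in LLM: deterministically transforms the prompt (no API calls)."""
    def generate(self, prompt: str) -> str:
        return prompt.upper()

class Task:
    """Scores a model output; here a toy length-based metric in [0, 1]."""
    def score(self, output: str) -> float:
        return min(len(output), 40) / 40

class Predictor:
    """Glue layer: runs the LLM on a prompt, lets the Task score the result."""
    def __init__(self, llm: EchoLLM, task: Task) -> None:
        self.llm, self.task = llm, task

    def evaluate(self, prompt: str) -> float:
        return self.task.score(self.llm.generate(prompt))

class RandomSearchOptimizer:
    """Minimal optimizer: mutate the best prompt and keep improvements."""
    def __init__(self, predictor: Predictor) -> None:
        self.predictor = predictor

    def optimize(self, seed: str, steps: int = 10) -> str:
        random.seed(0)  # reproducible, mirroring the framework's focus on reproducibility
        best, best_score = seed, self.predictor.evaluate(seed)
        for _ in range(steps):
            candidate = best + random.choice([" please", " step by step", "!"])
            score = self.predictor.evaluate(candidate)
            if score > best_score:
                best, best_score = candidate, score
        return best

predictor = Predictor(EchoLLM(), Task())
best_prompt = RandomSearchOptimizer(predictor).optimize("Classify the text")
```

Because each stage sits behind its own small interface, swapping in a real backend (e.g. an API-based or vLLM-served model) or a different search strategy (CAPO, EvoPrompt, OPRO) only touches one component.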
Stars: 114
Forks: 8
Language: Python
License: Apache-2.0
Category: prompt-engineering
Last pushed: Mar 02, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/automl/promptolution"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
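The endpoint above takes the category, owner, and repo as path segments. A minimal sketch of building and fetching that URL from Python's standard library (the `quality_url` helper is hypothetical; only the base URL and path shape come from the curl command above):

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the per-repo quality endpoint; each path segment is URL-escaped.
    return f"{BASE}/{quote(category, safe='')}/{quote(owner, safe='')}/{quote(repo, safe='')}"

url = quality_url("prompt-engineering", "automl", "promptolution")
# Fetch with e.g. urllib.request.urlopen(url) -- 100 requests/day without a key;
# with a free key, pass it per the API docs for 1,000/day.
```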
Higher-rated alternatives
linshenkx/prompt-optimizer
A prompt optimizer that helps you write high-quality prompts
Undertone0809/promptulate
🚀Lightweight Large language model automation and Autonomous Language Agents development...
CTLab-ITMO/CoolPrompt
Automatic Prompt Optimization Framework
microsoft/sammo
A library for prompt engineering and optimization (SAMMO = Structure-aware Multi-Objective...
Eladlev/AutoPrompt
A framework for prompt tuning using Intent-based Prompt Calibration