prompt-optimizer and promptimal

Both are prompt optimization tools with different design philosophies. prompt-optimizer is the mature, broadly adopted option: actively maintained, multi-platform, and feature-rich. promptimal is a deliberately minimal alternative that trades breadth for speed and low overhead, though it has seen no recent development. They suit different use cases rather than competing head-on.

                   prompt-optimizer             promptimal
Score              72 (Verified)                48 (Emerging)
Maintenance        25/25                        0/25
Adoption           10/25                        13/25
Maturity           16/25                        25/25
Community          21/25                        10/25
Stars              24,228                       300
Forks              2,893                        14
Downloads                                       17
Commits (30d)      81                           0
Language           TypeScript                   Python
License                                         MIT
Notes              No package, no dependents    Stale for 6 months

About prompt-optimizer

linshenkx/prompt-optimizer

A prompt optimizer that helps you write high-quality prompts.

Supports multi-model LLM backends (OpenAI, Gemini, DeepSeek, etc.) with dual optimization modes for system and user prompts, plus advanced testing via context variables, multi-turn sessions, and function calling. Available as a web app, desktop client, Chrome extension, Docker container, and MCP server for Claude Desktop integration. Client-side data processing and optional password protection support secure deployment.
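The dual-mode design amounts to wrapping the prompt being improved in a mode-specific meta-prompt before sending it to the selected backend. A minimal sketch of that dispatch, assuming hypothetical template text and function names (prompt-optimizer ships its own templates; this is not its actual API):

```python
# Illustrative sketch of dual optimization modes. The templates and the
# function name are hypothetical, not taken from prompt-optimizer.
META_PROMPTS = {
    "system": ("Improve the following SYSTEM prompt: make the role, "
               "constraints, and output format explicit.\n\n{prompt}"),
    "user": ("Improve the following USER prompt: clarify the task, add "
             "missing context, and remove ambiguity.\n\n{prompt}"),
}

def build_optimization_request(prompt: str, mode: str = "system") -> dict:
    """Wrap a raw prompt in a mode-specific meta-prompt, producing a
    request body for any OpenAI-compatible chat-completions backend."""
    if mode not in META_PROMPTS:
        raise ValueError(f"unknown mode: {mode!r}")
    return {
        "messages": [
            {"role": "user",
             "content": META_PROMPTS[mode].format(prompt=prompt)},
        ]
    }
```

Keeping the mode-to-template mapping in data rather than branching logic is what lets a tool like this add further modes (or user-supplied templates) without touching the request-building code.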

About promptimal

shobrook/promptimal

A very fast, very minimal prompt optimizer

Uses a genetic algorithm to iteratively refine prompts by generating candidate variations and scoring them with LLM-as-judge evaluation or custom evaluator functions. Supports hyperparameter tuning (iteration count, population size, termination threshold) and integrates with OpenAI's API by default, with a terminal UI for monitoring optimization progress. Can be extended with custom Python evaluator scripts for dataset-based or task-specific evaluation metrics.
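The loop described above can be sketched in a few lines. Everything here is an illustrative reconstruction, not promptimal's actual code: `mutate` stands in for LLM-generated candidate variations, `evaluate` for the LLM-as-judge or custom evaluator, and the keyword arguments mirror the hyperparameters listed (iteration count, population size, termination threshold). The demo components at the bottom are mocks so the sketch runs without API calls.

```python
import random

def optimize_prompt(seed_prompt, mutate, evaluate, *,
                    population_size=8, iterations=10,
                    threshold=1.0, rng=None):
    """Genetic-style prompt optimization sketch.

    mutate(prompt, rng) -> one candidate variation of the prompt
    evaluate(prompt)    -> fitness in [0, 1] (e.g. an LLM-as-judge call
                           or a custom evaluator function)
    """
    rng = rng or random.Random()
    population = [(evaluate(seed_prompt), seed_prompt)]
    for _ in range(iterations):
        best_score, best = max(population)
        if best_score >= threshold:  # early termination
            break
        # Breed a new generation from the current best, keeping the best
        # itself so fitness never regresses (elitism).
        candidates = [mutate(best, rng) for _ in range(population_size)]
        population = [(evaluate(p), p) for p in candidates]
        population.append((best_score, best))
    return max(population)[1]

# --- Mock components for demonstration (no API calls). The "judge"
# --- scores a prompt by how many words of a target phrase it contains.
TARGET = "summarize the article in three bullet points"

def evaluate(prompt):
    wanted = set(TARGET.split())
    return len(wanted & set(prompt.split())) / len(wanted)

def mutate(prompt, rng):
    return prompt + " " + rng.choice(TARGET.split())

best = optimize_prompt("summarize", mutate, evaluate,
                       iterations=20, rng=random.Random(0))
```

In a real run, `mutate` would prompt the backing LLM for a rewritten variant and `evaluate` would either ask a judge model to score it or invoke a user-supplied Python evaluator, which is where dataset-based or task-specific metrics plug in.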

Scores updated daily from GitHub, PyPI, and npm data.