Promptify and promptmage

These are complementary tools: Promptify provides structured output extraction and prompt versioning for individual LLM calls, while Promptmage orchestrates multiple LLM interactions into workflows—you'd use Promptify within Promptmage pipelines to ensure consistent, typed outputs across workflow steps.

|                | Promptify     | promptmage                        |
|----------------|---------------|-----------------------------------|
| Score          | 74 (Verified) | 45 (Emerging)                     |
| Maintenance    | 16/25         | 0/25                              |
| Adoption       | 14/25         | 9/25                              |
| Maturity       | 25/25         | 25/25                             |
| Community      | 19/25         | 11/25                             |
| Stars          | 4,572         | 114                               |
| Forks          | 361           | 8                                 |
| Downloads      | 62            |                                   |
| Commits (30d)  | 2             | 0                                 |
| Language       | Python        | Python                            |
| License        | Apache-2.0    | MIT                               |
| Flags          |               | No dependents; stale for 6 months |

About Promptify

promptslab/Promptify

Prompt Engineering | Prompt Versioning | Use GPT or other prompt based models to get structured output. Join our discord for Prompt-Engineering, LLMs and other latest research

Provides task-specific NLP classes (NER, classification, QA, relation extraction) that return type-safe Pydantic models instead of raw text, eliminating parsing brittleness. Abstracts away LLM provider differences through LiteLLM, allowing seamless switching between OpenAI, Anthropic, Ollama, and 100+ other backends with a single model string. Includes built-in evaluation metrics (precision, recall, F1) and cost tracking, plus batch/async processing for production workloads.
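To make the structured-output claim concrete, here is a minimal sketch of the pattern described above: validating a raw LLM completion into a typed Pydantic model instead of parsing strings by hand. The `Entity` and `NERResult` classes and the hard-coded `raw` response are illustrative assumptions, not Promptify's actual class names.

```python
from pydantic import BaseModel

# Hypothetical result schema for an NER task; Promptify's real task
# classes define their own models, so treat these names as a sketch.
class Entity(BaseModel):
    text: str
    label: str

class NERResult(BaseModel):
    entities: list[Entity]

# Stand-in for a raw LLM completion (a real call would produce this).
raw = '{"entities": [{"text": "Aspirin", "label": "DRUG"}]}'

# Validation replaces brittle string parsing: malformed output raises
# a ValidationError instead of silently producing garbage downstream.
result = NERResult.model_validate_json(raw)
print(result.entities[0].label)  # DRUG
```

The benefit is that every downstream workflow step can rely on the field names and types, which is exactly what makes such a library useful inside larger pipelines.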

About promptmage

tsterbak/promptmage

Simplifies the process of creating and managing LLM workflows.

Provides built-in prompt versioning, A/B testing, and comparison capabilities alongside an interactive playground for iteration. Generates a FastAPI-based REST API automatically with type-hint inference, supports both local and remote server deployment, and integrates testing/validation directly into the workflow development cycle.
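To illustrate the type-hint inference mentioned above, here is a minimal stdlib-only sketch of the underlying mechanism: introspecting a typed function to derive the parameter schema an auto-generated REST endpoint would need. The `summarize` step is hypothetical, and this is the general pattern, not promptmage's internal code.

```python
import inspect
from typing import get_type_hints

# Hypothetical workflow step; in practice this would wrap an LLM call.
def summarize(text: str, max_words: int = 20) -> str:
    """Return a crude truncation standing in for a real summary."""
    return " ".join(text.split()[:max_words])

hints = get_type_hints(summarize)
sig = inspect.signature(summarize)

# From hints + signature alone, a generator can infer each endpoint
# parameter's type and whether it is required (no default value).
params = {
    name: {
        "type": hints[name].__name__,
        "required": p.default is inspect.Parameter.empty,
    }
    for name, p in sig.parameters.items()
}
print(params)
# {'text': {'type': 'str', 'required': True},
#  'max_words': {'type': 'int', 'required': False}}
print(hints["return"].__name__)  # str
```

A FastAPI route generated from this information needs no extra annotation work from the author of the step, which is the convenience the description points at.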

Scores updated daily from GitHub, PyPI, and npm data.