Promptify and promptml
These are complementary tools: Promptify provides a framework for prompt engineering and versioning with structured output handling, while PromptML offers a markup language specification that could be used as an input format or templating syntax within such a framework.
About Promptify
promptslab/Promptify
Prompt engineering | Prompt versioning | Use GPT or other prompt-based models to get structured output. Join the Discord for prompt engineering, LLMs, and the latest research
Provides task-specific NLP classes (NER, classification, QA, relation extraction) that return type-safe Pydantic models instead of raw text, eliminating parsing brittleness. Abstracts away LLM provider differences through LiteLLM, allowing seamless switching between OpenAI, Anthropic, Ollama, and 100+ other backends with a single model string. Includes built-in evaluation metrics (precision, recall, F1) and cost tracking, plus batch/async processing for production workloads.
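The "typed result instead of raw text" idea above can be sketched with only the standard library. This is an illustrative pattern, not Promptify's actual API: the model is asked to reply in JSON, and the reply is validated into a dataclass so downstream code never handles unparsed strings. The `Entity` type and `parse_ner_reply` helper are hypothetical names.

```python
import json
from dataclasses import dataclass

@dataclass
class Entity:
    """One extracted named entity (hypothetical stand-in for a Pydantic model)."""
    text: str
    label: str

def parse_ner_reply(raw_reply: str) -> list[Entity]:
    """Validate a model's JSON reply into typed Entity records.

    A malformed reply raises immediately (json.JSONDecodeError or KeyError)
    instead of silently propagating broken strings downstream.
    """
    items = json.loads(raw_reply)
    return [Entity(text=i["text"], label=i["label"]) for i in items]

# A canned reply, standing in for an actual LLM call.
reply = '[{"text": "Paris", "label": "LOC"}, {"text": "Marie", "label": "PER"}]'
entities = parse_ner_reply(reply)
```

Libraries like Promptify layer prompt templates, provider routing, and richer validation on top of this basic contract, but the failure mode it removes is the same: free-text answers that each caller must re-parse.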
About promptml
narenaryan/promptml
A Prompt Markup Language (a.k.a. PromptML) library built specifically for AI systems, from Vidura AI
Provides a structured DSL that decomposes prompts into explicit sections—context, objective, instructions, examples, constraints, and metadata—parsed into standardized data structures. Supports variable interpolation and serialization to XML, YAML, and JSON formats, enabling version control and cross-agent prompt reusability. Integrates with OpenAI and Google models via the companion `promptml-cli` tool.
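The section-based decomposition described above can be sketched in a few lines. This is a simplification for illustration only: the `@section … @end` syntax and the regex parser below are assumptions, not the real PromptML grammar or parser. The point is the shape of the pipeline: named sections are parsed into a plain dict, which can then be serialized (here to JSON) for versioning or reuse.

```python
import json
import re

# Matches a named block of the form "@name <body> @end" (simplified syntax,
# not actual PromptML grammar).
SECTION_RE = re.compile(r"@(\w+)\s+(.*?)\s+@end", re.DOTALL)

def parse_sections(source: str) -> dict[str, str]:
    """Extract {section_name: body} pairs from a section-marked prompt."""
    return {name: body.strip() for name, body in SECTION_RE.findall(source)}

doc = """
@context You are a travel assistant. @end
@objective Suggest a three-day itinerary. @end
@constraints Keep each day under 200 words. @end
"""

sections = parse_sections(doc)
serialized = json.dumps(sections, indent=2)  # versionable, diff-friendly form
```

Because the parsed form is an ordinary mapping, the same prompt can be re-emitted as YAML or XML, diffed in version control, or handed to a different agent, which is the reusability argument made above.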