Promptify and promptml-cli
Promptify is a full-fledged prompt engineering framework with versioning and structured-output capabilities; promptml-cli is a command-line interface for executing Prompt Markup Language scripts. The two could be complementary: the CLI could integrate with, or be used to run, prompts managed within the broader framework.
About Promptify
promptslab/Promptify
Prompt Engineering | Prompt Versioning | Use GPT or other prompt based models to get structured output. Join our discord for Prompt-Engineering, LLMs and other latest research
Provides task-specific NLP classes (NER, classification, QA, relation extraction) that return type-safe Pydantic models instead of raw text, eliminating parsing brittleness. Abstracts away LLM provider differences through LiteLLM, allowing seamless switching between OpenAI, Anthropic, Ollama, and 100+ other backends with a single model string. Includes built-in evaluation metrics (precision, recall, F1) and cost tracking, plus batch/async processing for production workloads.
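As a rough sketch of that task-pipeline pattern: the snippet below follows the shape shown in Promptify's README. The class names (`Prompter`, `OpenAI`, `Pipeline`), the `ner.jinja` template name, and the `fit()` parameters are assumptions that may differ across Promptify versions, so treat this as an illustration rather than a verified API.

```python
# Illustrative Promptify sketch; names are assumptions drawn from the
# project README and may have changed between versions.
import os

def extract_entities(text: str):
    from promptify import Prompter, OpenAI, Pipeline  # pip install promptify

    model = OpenAI(os.environ["OPENAI_API_KEY"])  # or another supported backend
    prompter = Prompter("ner.jinja")              # bundled NER prompt template
    pipe = Pipeline(prompter, model)
    # Returns structured entities instead of raw completion text.
    return pipe.fit(text, domain="general", labels=None)
```

The point of the pattern is that the pipeline, not your application code, owns prompt construction and output parsing, so swapping the model backend does not change the calling code.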
About promptml-cli
narenaryan/promptml-cli
A CLI application to run Prompt Markup Language scripts
Executes structured prompt templates (`.pml` files) against multiple LLM providers—OpenAI, Google GenAI, and Ollama—with configurable serialization formats (XML, JSON, YAML) and streaming or buffered response modes. Supports piping and file output via raw mode, enabling integration into shell workflows. Built with pluggable provider architecture for extensibility to additional model providers.
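For context, a `.pml` script is a block-structured prompt definition. The fragment below is a minimal sketch using section names (`@prompt`, `@context`, `@objective`, `@instructions`, `@step`) as we understand them from the PromptML project; consult the promptml documentation for the authoritative grammar.

```
@prompt
    @context
        You are a concise technical assistant.
    @end
    @objective
        Summarize the given text in three bullet points.
    @end
    @instructions
        @step
            Read the input carefully.
        @end
        @step
            Emit exactly three bullets, one fact each.
        @end
    @end
@end
```

A file like this would then be passed to promptml-cli with the provider and serialization format of your choice (exact flags are documented in the project README).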