Promptify and promptml-cli

Promptify is a full-fledged prompt-engineering framework with versioning and structured-output capabilities, while promptml-cli is a command-line interface for executing a prompt markup language. The two could be complementary: the CLI could integrate with, or be used to run, prompts managed within the broader framework.

                 Promptify        promptml-cli
Score            74 (Verified)    40 (Emerging)
Maintenance      16/25            6/25
Adoption         14/25            9/25
Maturity         25/25            18/25
Community        19/25            7/25
Stars            4,572            11
Forks            361              1
Downloads        62               35
Commits (30d)    2                0
Language         Python           Python
License          Apache-2.0       MIT
Dependents       n/a              None
Risk flags       n/a              None

About Promptify

promptslab/Promptify

Prompt Engineering | Prompt Versioning | Use GPT or other prompt based models to get structured output. Join our discord for Prompt-Engineering, LLMs and other latest research

Provides task-specific NLP classes (NER, classification, QA, relation extraction) that return type-safe Pydantic models instead of raw text, eliminating parsing brittleness. Abstracts away LLM provider differences through LiteLLM, allowing seamless switching between OpenAI, Anthropic, Ollama, and 100+ other backends with a single model string. Includes built-in evaluation metrics (precision, recall, F1) and cost tracking, plus batch/async processing for production workloads.
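The structured-output pattern described above can be illustrated with a short self-contained sketch. The Entity dataclass and parse_ner_response helper here are hypothetical stand-ins, not Promptify's actual API; the "T"/"E" keys mirror the text/entity-type shape commonly seen in its NER results:

```python
import json
from dataclasses import dataclass

@dataclass
class Entity:
    text: str   # the matched span, e.g. "aspirin"
    label: str  # the entity type, e.g. "Drug"

def parse_ner_response(raw: str) -> list[Entity]:
    # A structured-output wrapper validates the model's JSON response
    # and returns typed objects instead of brittle raw text.
    data = json.loads(raw)
    return [Entity(text=item["T"], label=item["E"]) for item in data]

raw = '[{"T": "aspirin", "E": "Drug"}, {"T": "headache", "E": "Symptom"}]'
entities = parse_ner_response(raw)
print(entities[0])  # Entity(text='aspirin', label='Drug')
```

Typed results like these let downstream code access fields directly (entities[0].label) rather than re-parsing free-form model output.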

About promptml-cli

narenaryan/promptml-cli

A CLI application to run Prompt Markup Language scripts

Executes structured prompt templates (`.pml` files) against multiple LLM providers (OpenAI, Google GenAI, and Ollama), with configurable serialization formats (XML, JSON, YAML) and streaming or buffered response modes. Supports piping and file output via raw mode, enabling integration into shell workflows. Built with a pluggable provider architecture, making it extensible to additional model providers.
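For context, a minimal `.pml` script might look like the sketch below. The section names follow the PromptML language's `@section ... @end` block convention; the exact set of supported sections is defined by the PromptML spec, and the content here is purely illustrative:

```
@prompt
    @context
        You are a helpful travel assistant.
    @end
    @objective
        Suggest three places to visit in Paris.
    @end
    @instructions
        @step
            Keep each suggestion to one sentence.
        @end
    @end
@end
```

The CLI then executes a file like this against a configured provider and renders the response in the chosen serialization format (see `promptml-cli --help` for the actual invocation options).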

Scores updated daily from GitHub, PyPI, and npm data.