prompty and dotprompt
These are **competitors**: both provide standardized file formats and tooling to manage, version, and execute LLM prompts, with Prompty offering broader lifecycle management (debugging, evaluation) while dotprompt focuses on template execution.
About prompty
microsoft/prompty
Prompty makes it easy to create, manage, debug, and evaluate LLM prompts for your AI applications. Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers.
Prompty defines a standardized YAML-based format for encapsulating prompts with their model configurations, template variables, and execution metadata into portable `.prompty` files. The VS Code extension provides live preview rendering, built-in model switching (supporting Azure OpenAI and OpenAI with AAD/environment variable authentication), and one-click execution with verbose request/response logging. Native integration with Prompt Flow, LangChain, and Semantic Kernel enables seamless orchestration within existing AI frameworks.
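To make the format concrete, here is a minimal sketch of a `.prompty` file. The field layout follows the published Prompty format (YAML frontmatter delimited by `---`, followed by a role-tagged template body), but the specific name, deployment, and sample values are illustrative placeholders:

```
---
name: Support Question
description: Answer a customer question concisely
model:
  api: chat
  configuration:
    type: azure_openai
    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
    azure_deployment: gpt-4o-mini
  parameters:
    max_tokens: 1024
sample:
  question: What can you tell me about your tents?
---

system:
You are a helpful support assistant. Answer concisely.

user:
{{question}}
```

Because the model configuration, template variables, and a sample input all live in one file, the VS Code extension can render a live preview from `sample` and execute the prompt against the configured endpoint without any surrounding application code.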
About dotprompt
google/dotprompt
Executable GenAI prompt templates
Extends Handlebars templating with GenAI-specific features and metadata, supporting model-agnostic prompt execution across JavaScript/TypeScript, Python, Go, Rust, and Java. Includes IDE plugins (VS Code, JetBrains, Vim, Emacs) and web editors (Monaco, CodeMirror) with Tree-sitter grammar parsing for syntax support. A `.prompt` file carries more than template text: it also encodes the metadata needed to invoke a model, including model selection, generation parameters, and input/output schemas.
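For comparison, here is a sketch of a `.prompt` file in the dotprompt format. The overall shape (YAML frontmatter plus a Handlebars template body) follows the dotprompt documentation, but the model name, config values, and schema are illustrative:

```
---
model: googleai/gemini-1.5-flash
config:
  temperature: 0.7
input:
  schema:
    location: string
  default:
    location: a busy train station
---

You are a friendly greeter. Welcome a visitor arriving at {{location}}.
```

The body is standard Handlebars, so constructs such as `{{#if}}` blocks and partials are available alongside the GenAI-specific extensions, and the same file can be executed unchanged from any of the supported language runtimes.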