When I discovered Google's Dotprompt format (frontmatter + Handlebars templates), I realized it was perfect for something I'd been wanting: treating prompts as first-class programs you can pipe together Unix-style. Google uses Dotprompt in Firebase Genkit and I wanted something simpler - just run a .prompt file directly on the command line.
Here's what it looks like:
---
model: anthropic/claude-sonnet-4-20250514
output:
  format: json
  schema:
    sentiment: string, positive/negative/neutral
    confidence: number, 0-1 score
---
Analyze the sentiment of: {{STDIN}}
Running it:
cat reviews.txt | ./runprompt sentiment.prompt | jq '.sentiment'
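Under the hood, a .prompt file is just frontmatter between `---` delimiters plus a template body, so the split is simple. A minimal sketch of that parsing step (a hypothetical helper for illustration, not runprompt's actual code):

```python
def parse_prompt_file(text):
    """Split a .prompt file into (frontmatter_lines, body).

    Assumes the file opens with a '---' line and the frontmatter
    is closed by a second '---' line, as in the example above.
    """
    lines = text.splitlines()
    if lines and lines[0].strip() == "---":
        # find the closing delimiter after the opening one
        end = next(i for i, l in enumerate(lines[1:], 1) if l.strip() == "---")
        frontmatter = lines[1:end]
        body = "\n".join(lines[end + 1:]).strip()
    else:
        # no frontmatter block at all: the whole file is the prompt
        frontmatter, body = [], text.strip()
    return frontmatter, body
```

For the sentiment example, this yields the `model:` and `output:` lines as frontmatter and `Analyze the sentiment of: {{STDIN}}` as the body.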
The things I think are interesting:
* Structured output schemas: Define JSON schemas in the frontmatter using a simple `field: type, description` syntax. The LLM reliably returns valid JSON you can pipe to other tools.
* Prompt chaining: Pipe JSON output from one prompt as template variables into the next. This makes it easy to build multi-step agentic workflows as simple shell pipelines.
* Zero dependencies: It's a single Python file that uses only stdlib. Just curl it down and run it.
* Provider agnostic: Works with Anthropic, OpenAI, Google AI, and OpenRouter (which gives you access to dozens of models through one API key).
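The `field: type, description` shorthand in the schema bullet maps naturally onto JSON Schema: everything before the first comma is the type, the rest is a description. A sketch of that expansion (my guess at the translation, not runprompt's actual code):

```python
def expand_schema(fields):
    """Expand shorthand like {'sentiment': 'string, positive/negative/neutral'}
    into a JSON Schema object suitable for structured-output requests."""
    props = {}
    for name, spec in fields.items():
        # split 'type, description' on the first comma only
        type_, _, desc = (part.strip() for part in spec.partition(","))
        props[name] = {"type": type_}
        if desc:
            props[name]["description"] = desc
    return {"type": "object", "properties": props, "required": list(props)}
```

With the frontmatter above, this produces an object schema with a string `sentiment` and a numeric `confidence`, which is what makes the JSON output reliable enough to pipe into `jq`.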
You can use it to automate things like extracting structured data from unstructured text, generating reports from logs, and building small agentic workflows without spinning up a whole framework.
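Chaining works because the JSON keys from one prompt's output become template variables for the next. A naive sketch of that substitution step, handling only bare `{{name}}` placeholders (a real Handlebars engine does much more):

```python
import json
import re

def render(template, variables):
    """Replace {{name}} placeholders with values from a dict,
    leaving unknown placeholders untouched."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

# JSON emitted by the previous prompt in the pipeline, read from stdin:
upstream = json.loads('{"sentiment": "negative", "confidence": 0.92}')
print(render("Draft a reply to this {{sentiment}} review.", upstream))
# -> Draft a reply to this negative review.
```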
Would love your feedback, and PRs are most welcome!
I've been using MLflow to store my prompts, but wanted something lightweight on the CLI to version and manage them. I set up pmp so you can have different storage backends (file, SQLite, MLflow, etc.).
I wasn't aware of dotprompt, I might build that in too.
I wasn't aware of the whole ".prompt" format, but it makes a lot of sense.
Very neat. These are the kinds of tools I love to see. Functional and useful, not trying to be "the next big thing".
"Chain Prompts Like Unix Tools with Dotprompt"
https://pythonic.ninja/blog/2025-11-27-dotprompt-unix-pipes/
"One-liner code review from staged changes" - love this example.
Seems like it would be: just swap the OpenAI URL here, or add a new one.
https://microsoft.github.io/promptflow/how-to-guides/develop...
#!/bin/bash
file="$1"
model=$(sed -n '2p' "$file" | sed 's/^# *//')  # line 2 holds the model name
prompt=$(tail -n +3 "$file")
curl -s https://api.anthropic.com/v1/messages \
-H "x-api-key: $ANTHROPIC_API_KEY" \
-H "content-type: application/json" \
-H "anthropic-version: 2023-06-01" \
-d "{
\"model\": \"$model\",
\"max_tokens\": 1024,
\"messages\": [{\"role\": \"user\", \"content\": $(echo "$prompt" | jq -Rs .)}]
}" | jq -r '.content[0].text'
hello.prompt:

#!/usr/local/bin/promptrun
# claude-sonnet-4-20250514
Write a haiku about terminal commands.

If you curl/wget a script, you still need to chmod +x it. Git doesn't have this issue, as it retains the file metadata.
#!/usr/bin/env runprompt
---
.frontmatter...
---
The prompt.
Would be a lot nicer, as then you can just +x the prompt file itself.