promptcmd is a manager and executor for programmable prompts. Define a
prompt template once, then execute it like any other terminal command, complete
with argument parsing, --help text, and stdin/stdout integration.
Unlike tools that require you to manually manage prompt files or rely on implicit tool-calling, promptcmd gives you explicit control over what data your models access. Build compositional workflows by nesting prompts, executing shell commands within templates, and piping data through multi-step AI pipelines.
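As an illustration, a compositional pipeline might chain several prompt commands. The command names below are hypothetical prompts you would create yourself; they are not shipped with promptcmd:

```shell
# Hypothetical prompts, created with `promptctl create` and enabled beforehand:
# summarize-changes condenses a git log, write-release-notes formats the result.
$ git log --oneline -20 | summarize-changes | write-release-notes > RELEASE.md
```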
Create a .prompt file, enable it with promptctl, and execute it like any
native tool.
$ promptctl create bashme_a_script_that
$ bashme_a_script_that renames all files in current directory to ".backup"

Prepend SSH commands with promptctl, and your prompts magically appear in your
remote shell sessions.

$ promptctl ssh user@server
server $ bashme_a_script_that renames all files in current directory to ".backup"

Use your Ollama endpoint or configure an API key for OpenAI, OpenRouter, Anthropic, Google, or MiniMax. Swap between them with ease.
$ promptctl create render-md
$ cat README.md | render-md -m openai
$ cat README.md | render-md -m ollama/gpt-oss:20b

Distribute requests across several providers with equal or weighted distribution for cost optimization.
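The group example that follows splits requests equally. A weighted group might be sketched like this — note the `weights` key is an assumption for illustration, not confirmed syntax; check the documentation for your version:

```toml
# Assumed syntax: route roughly three quarters of requests to openai
[groups.cheap_first]
providers = ["openai", "google"]
weights = [3, 1]
```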
# config.toml
[groups.balanced]
providers = ["openai", "google"]

$ cat README.md | render-md -m balanced

Cache responses for a configurable amount of time to add determinism to pipelines and make token consumption more efficient.
# config.toml
[providers.openai]
cache_ttl = 60 # number of seconds

Set or override it during execution:

$ cat README.md | render-md --config-cache-ttl 120

Use variants to define custom models with their own personality or task specialization:
[providers.anthropic]
api_key = "sk-xxxxx"
model = "claude-sonnet-4-5"
[providers.anthropic.glados]
system = "Use sarcasm and offending jokes like the GlaDoS character from Portal."
[providers.anthropic.wheatley]
system = "Reply as if you are Wheatley from Portal."

$ tipoftheday -m glados
$ tipoftheday -m wheatley

Linux/macOS:
curl -LsSf https://installer.promptcmd.sh | sh

macOS (Homebrew):
brew install tgalal/tap/promptcmd

Windows (PowerShell):
powershell -ExecutionPolicy Bypass -c "irm https://installer-ps.promptcmd.sh | iex"

Configure your API keys by editing config.toml:
promptctl config edit

Find your provider's section, e.g., for anthropic:
[providers.anthropic]
api_key = "sk-ant-api03-..."

Alternatively, you can set the keys via environment variables:
PROMPTCMD_ANTHROPIC_API_KEY="your_api_key"
PROMPTCMD_OPENAI_API_KEY="your_api_key"
PROMPTCMD_OPENROUTER_API_KEY="your_api_key"
PROMPTCMD_MINIMAX_API_KEY="your_api_key"
Create a summarize.prompt file:
promptctl create summarize

Insert the following:
---
model: anthropic/claude-sonnet-4-5
input:
  schema:
    words?: integer, Summary length in words
---
Summarize the following text{{#if words}} in {{words}} words{{/if}}:
{{STDIN}}

Enable and use it:
# Enable as a command
promptctl enable summarize
# Use it
cat article.txt | summarize
echo "Long text here..." | summarize --words 10
# Auto-generated help
summarize --help

That's it. Your prompt is now a native command.
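Because prompts behave like ordinary commands, they compose in shell pipelines. For example, chaining the summarize prompt with the render-md prompt from earlier (assuming both are enabled):

```shell
# Summarize an article, then render the summary as markdown via Ollama
$ cat article.txt | summarize --words 100 | render-md -m ollama/gpt-oss:20b
```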
Full documentation available at: docs.promptcmd.sh
Browse the Examples directory or visit https://promptcmd.sh/lib for interactive viewing.
GPLv3 License - see LICENSE file for details
