The `langwatch prompt` command provides dependency management for AI prompts as plain YAML files, enabling you to version prompts locally with Git while synchronizing with the LangWatch platform for testing, evaluation, and team collaboration.
## Installation
Install the CLI globally:

## Quick Start
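If you have not installed the CLI yet, a global install might look like this, assuming the CLI is distributed as an npm package named `langwatch` (an assumption; verify the package name against the official install instructions):

```shell
npm install -g langwatch
```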
### 1. Initialize Your Project
Create a new prompts project with `langwatch prompt init`.

### 2. Add Your First Prompt

Create a local prompt with `langwatch prompt create <name>`. The file is created in the `prompts/` directory and looks like this:
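A newly created prompt file might look like the following sketch; the field names and model are illustrative assumptions, not a guaranteed schema:

```yaml
# prompts/my-prompt.prompt.yaml - illustrative default template
model: openai/gpt-4o-mini        # example model identifier
messages:
  - role: system
    content: You are a helpful assistant.
```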
### 3. Synchronize
Sync all prompts (fetch remote, push local changes) with `langwatch prompt sync`.

## Using the Prompts
Once you’ve created your prompts, you can use them in your application code. The LangWatch SDK provides a simple interface to fetch and compile prompts with dynamic variables, with both Python and TypeScript clients.
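A minimal sketch of what this looks like in Python; the method names (`prompts.get`, `compile`) and return shape are assumptions here, so check the SDK reference for the actual API:

```python
# Sketch only - method and attribute names are assumptions, not the
# SDK's documented API; requires a configured LangWatch API key.
import langwatch

prompt = langwatch.prompts.get("agent/customer-service")      # fetch by name
compiled = prompt.compile(customer_name="Ada", issue="billing")
# compiled messages are ready to pass to your LLM provider
```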
### Working with Local and Remote Prompts

The SDK loads prompts dynamically at runtime, so you don’t need to worry about whether they’re local or remote:

- Local prompts: Fetched directly from your LangWatch project
- Remote prompts: Also fetched from LangWatch after being synced from the materialized files
For production deployments, the CLI materializes prompts locally so you have a complete snapshot of all dependencies. The SDK can automatically use these materialized prompts for guaranteed availability in offline or air-gapped environments. However, when online, the SDK always fetches the latest version from the server unless you specify a version. Learn more about guaranteed availability and offline deployments.
### Loading Specific Versions

You can also load specific prompt versions instead of always using the latest, using the `name@version` specifier.

## Prompt Variables
Prompts use `{{variable_name}}` syntax for dynamic content. When you compile a prompt, provide all required variables:
`my-prompt.prompt.yaml`
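An illustrative example with two required variables (field names are assumptions):

```yaml
# my-prompt.prompt.yaml - illustrative
model: openai/gpt-4o-mini
messages:
  - role: system
    content: You are a translator. Translate the user's text into {{target_language}}.
  - role: user
    content: "{{text}}"
```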
The compiled prompt contains fully rendered messages ready to send to your LLM provider, with all variables replaced and the correct model configuration applied.
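As an illustration of the substitution semantics only (not the SDK's actual implementation), compiling `{{variable_name}}` placeholders can be sketched in plain Python:

```python
import re

def compile_prompt(template: str, variables: dict) -> str:
    """Replace every {{variable_name}} placeholder with its value,
    raising KeyError if a required variable is missing."""
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"missing required variable: {name}")
        return str(variables[name])
    # Allow optional whitespace inside the braces: {{ name }} or {{name}}
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

message = compile_prompt(
    "Summarize the following text for {{audience}}: {{text}}",
    {"audience": "executives", "text": "Q3 revenue grew 12%."},
)
print(message)
# Summarize the following text for executives: Q3 revenue grew 12%.
```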
## Core Concepts

### Dependency Management
The CLI uses two configuration files.

`prompts.json` - Declares your prompt dependencies:
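An illustrative `prompts.json`; the exact shape is an assumption, though the `name@version` and `file:` specifiers follow the dependency syntax described in this section:

```json
{
  "prompts": {
    "agent/customer-service": "latest",
    "my-prompt": "file:./prompts/my-prompt.prompt.yaml"
  }
}
```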
`prompts-lock.json` - Tracks resolved versions and materialized file paths:
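And an illustrative lock entry (field names are assumptions):

```json
{
  "prompts": {
    "agent/customer-service": {
      "version": 12,
      "materialized": "prompts/.materialized/agent/customer-service.prompt.yaml"
    }
  }
}
```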
### Local vs Remote Prompts
**Remote Prompts** (`agent/customer-service@latest`)

- Pulled from the LangWatch platform
- Fetched and materialized locally in `./prompts/.materialized/`
- Read-only locally

**Local Prompts** (`file:./prompts/my-prompt.prompt.yaml`)

- Stored as local YAML files
- Version controlled with Git
- Pushed to the platform during sync for sharing and evaluation
### YAML Format
Prompt files end with the `.prompt.yaml` extension and follow this format:
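An illustrative `.prompt.yaml` file; the exact field names are assumptions based on the concepts above (a model configuration plus a list of messages with `{{variable}}` placeholders):

```yaml
# Illustrative format - verify field names against the official docs
model: openai/gpt-4o-mini
temperature: 0.2
messages:
  - role: system
    content: You are a concise support agent for {{product_name}}.
  - role: user
    content: "{{question}}"
```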
## Commands Reference

### `langwatch prompt init`

Initialize a new prompts project in the current directory.
### `langwatch prompt add <spec> [localFile]`
Add a new prompt dependency and immediately fetch/materialize it.
- `<spec>` - Prompt specification (`name@version`, or `name` for latest)
- `[localFile]` - Optional path to a local YAML file to add

This command:

- Updates `prompts.json` with the new dependency
- Fetches the prompt from the server and materializes it locally
- Updates `prompts-lock.json` with the resolved version
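For example (the prompt names are placeholders):

```shell
langwatch prompt add agent/customer-service@latest
langwatch prompt add my-prompt ./prompts/my-prompt.prompt.yaml
```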
### `langwatch prompt remove <name>`
Remove a prompt dependency and clean up associated files.
- Removes the entry from `prompts.json`
- Removes the entry from `prompts-lock.json`
- Deletes the materialized file
- For local prompts: deletes source file and warns about server state
### `langwatch prompt create <name>`
Create a new local prompt file with default content.
- Creates `./prompts/<name>.prompt.yaml` with template content
- Automatically adds it to `prompts.json` as a `file:` dependency
- Updates `prompts-lock.json`
### `langwatch prompt sync`
Synchronize all prompts between local files and the server.
- Fetches remote prompts if new versions available
- Pushes local prompt changes to server
- Handles conflict resolution interactively
- Cleans up orphaned materialized files
- Reports what was synced
### `langwatch prompt list`
Display current prompt dependencies and their status.
## CI/CD Integration
Integrate prompt materialization into your deployment pipeline. For example, in `.github/workflows/deploy.yml`:
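A sketch of such a workflow; the npm package name and the `LANGWATCH_API_KEY` secret name are assumptions:

```yaml
# .github/workflows/deploy.yml (sketch)
name: Deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm install -g langwatch   # package name is an assumption
      - run: langwatch prompt sync      # materialize prompt dependencies
        env:
          LANGWATCH_API_KEY: ${{ secrets.LANGWATCH_API_KEY }}
      # ...your build and deploy steps
```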
## Workflows

### Team Collaboration
Setup:

- One team member initializes the project with `langwatch prompt init`
- Commit `prompts.json` and `prompts-lock.json` to Git
- Add `prompts/.materialized` to `.gitignore`
- Team members run `langwatch prompt sync` after pulling
### Version Management

**Pinning Versions:** pin a dependency to an exact version with the `name@version` specifier instead of `latest`.

## Coding Assistant Integration
Since prompts are just YAML files, you can reference them directly from other tools or coding assistants.

### Cursor Integration
Reference prompts in a `.cursor/rules/*.mdc` file:
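A minimal rule file might look like this; the frontmatter fields and rule body are illustrative, so check the Cursor rules documentation for the exact format:

```markdown
---
description: Prompt editing conventions
globs: prompts/**/*.prompt.yaml
---

When changing prompt behavior, edit the YAML files under prompts/
and run `langwatch prompt sync` rather than editing prompts in the UI.
```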
### Cloud Code Integration

Include prompt content in cloud development environments by referencing the YAML files in the `prompts/.materialized` directory.