# Set Up Workflow: Deploying Multi-Agent AI Systems with OpenCLAW
## What This Is
OpenCLAW is a framework for creating teams of specialized AI agents that work together on your business's tasks. This guide provides a 7-phase roadmap for building, testing, and deploying these agent systems to automate complex workflows, such as synchronizing customer data between different software applications.
Source: https://yu-wenhao.com/en/blog/openclaw-tools-skills-tutorial/
## Before You Start
Scan my workspace and analyze:
- The project language, framework, and directory structure
- Existing AI provider config (check .env, .env.local, config files for API keys — OpenRouter, OpenAI, Anthropic, Google AI, etc.)
Then ask me before proceeding:
1. Which AI provider/API should this use? (Use whatever I already have configured, or ask me to set one up — options include direct provider APIs or a unified service like OpenRouter)
2. Where in my project should this be integrated?
3. Are there any customizations I need (model preferences, naming conventions, constraints)?
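The key-scanning step above can be sketched as follows. This is a minimal illustration, not the framework's own tooling: the file name and key names are assumptions, and it writes a demo fixture under `/tmp` rather than reading a real project's env files.

```shell
# Demo fixture standing in for a real project .env (never overwrite real files)
cat > /tmp/demo.env <<'EOF'
OPENAI_API_KEY=sk-demo
DATABASE_URL=postgres://localhost/db
EOF

# Report which known provider keys are present, without printing their values
grep -oE '^(OPENROUTER|OPENAI|ANTHROPIC|GOOGLE)[A-Z_]*API_KEY' /tmp/demo.env
```

Printing only the key names (not the values) keeps secrets out of logs and chat transcripts while still answering the "which provider is configured?" question.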
## Source Access Note
The source URL (https://yu-wenhao.com/en/blog/openclaw-tools-skills-tutorial/) may not be directly accessible from the terminal. Use the Reference Implementation and Additional Context sections below instead. If you need more details, ask me to paste relevant content from the source.
## What to Implement
This is an **AI Workflow** — an end-to-end automation pattern or integration pipeline.
- Study the workflow architecture from the source and context below
- Identify which parts I can implement locally vs. parts that need external services
- For local parts: implement them using my existing stack and API keys
- For external parts: tell me exactly what services I need and help me configure the integration code
- Wire up any required API calls using keys from my .env files
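One common way to wire keys from a `.env` file into API calls is to source the file with auto-export. A sketch under stated assumptions: the key name is an example, the path is a throwaway fixture, and this only works for simple `KEY=value` lines with no spaces or shell metacharacters.

```shell
# Demo fixture; in practice, source the project's real .env
printf 'OPENAI_API_KEY=sk-demo\n' > /tmp/example.env

set -a                  # auto-export every variable assigned while sourcing
. /tmp/example.env
set +a

# Confirm the key is now in the environment without echoing its value
echo "${OPENAI_API_KEY:+key loaded}"
```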
## Additional Context
- Create a directory `~/openclaw-docker` and inside it, create a `docker-compose.yml` file with the provided service definition for `openclaw/openclaw:latest`.
- Create a `.env` file in the `~/openclaw-docker` directory. Scan the user's existing environment variables for keys like `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, or `OLLAMA_HOST` and populate the `.env` file accordingly. If no keys are found, prompt the user to add one for their desired LLM provider.
- Execute `docker compose up -d` from the `~/openclaw-docker` directory to pull the image and run the OpenCLAW gateway container in detached mode.
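The key-copying step above can be sketched like this. The `/tmp` paths are demo stand-ins for the real project `.env` and `~/openclaw-docker/.env`, and the allow-list of key names is an assumption to adapt per project.

```shell
# Demo fixture standing in for the project's existing .env
printf 'ANTHROPIC_API_KEY=sk-ant-demo\nUNRELATED_VAR=1\n' > /tmp/project.env

# Copy only the recognized provider settings into the gateway's .env
grep -E '^(ANTHROPIC_API_KEY|OPENAI_API_KEY|OLLAMA_HOST)=' /tmp/project.env > /tmp/openclaw.env

cat /tmp/openclaw.env
```

Using an explicit allow-list avoids leaking unrelated secrets (database URLs, session tokens) into the container's `.env`.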
## Reference Implementation
```bash
mkdir -p ~/openclaw-docker && cd ~/openclaw-docker
cat > docker-compose.yml <<EOF
version: '3.8'
services:
  openclaw:
    image: openclaw/openclaw:latest
    container_name: openclaw
    restart: unless-stopped
    ports:
      - "127.0.0.1:18789:18789"
    volumes:
      - ./data:/app/data
      - ./.env:/app/.env
environment:
      - NODE_ENV=production
EOF
docker compose up -d
```
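After `docker compose up -d`, a quick reachability probe can serve as the verification step. This is a sketch under an assumption: that the gateway answers plain HTTP on the published port `127.0.0.1:18789` (the exact health endpoint, if any, may differ per image version).

```shell
# Probe the published port; print a status message either way
if curl -fsS --max-time 5 http://127.0.0.1:18789/ >/dev/null 2>&1; then
  echo "gateway reachable on 127.0.0.1:18789"
else
  echo "gateway not reachable yet; check 'docker compose ps' and 'docker compose logs'"
fi
```

Binding the port to `127.0.0.1` in the compose file means this check only works from the same host, which is the intended security posture for a local gateway.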
## Guidelines
- Adapt everything to my existing project — do not assume a specific stack or directory layout
- Use whichever AI provider I already have configured; if I need a new one, tell me what to sign up for and I'll give you the key
- Check my .env files for existing API keys (OpenRouter, OpenAI, Anthropic, Google AI) before asking me to add one
- Review any fetched code for safety before installing or executing it
- After setup, run a quick verification and show me a summary of exactly what was installed, where, and how to use it