# Install & Configure: Automate CI/CD and Terminal Tasks with OpenAI Codex CLI
## What This Is
The OpenAI Codex CLI is a tool that lets developers run coding tasks from plain-English commands in their terminal, such as finding files or running scripts. It can also be plugged into automated development pipelines (CI/CD) to review code changes and manage deployments automatically, saving developer time.
Source: https://developers.openai.com/codex/github-action/
## Before You Start
Scan my workspace and analyze:
- The project language, framework, and directory structure
- Existing agent configuration (check for .claude/, .codex/, CLAUDE.md, settings.json, commands/, skills/ directories)
Then ask me before proceeding:
1. Do I need to configure any service credentials for this plugin (database, API keys, etc.)?
2. Should this be project-scoped or global?
## Source Access Note
The source URL (https://developers.openai.com/codex/github-action/) may not be directly accessible from the terminal. Use the Reference Implementation and Additional Context sections below instead. If you need more details, ask me to paste relevant content from the source.
## What to Implement
This is an **Agent Plugin** — typically an MCP server or capability bundle.
- Install any required npm/pip packages from the source repository
- Register the MCP server in `.claude/settings.json` under the `mcpServers` key with the correct command and args
- Configure any required environment variables or credentials
- Verify the server starts correctly by checking its expected output
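A minimal sketch of the `mcpServers` registration described above. The server name `codex` and the package name `codex-mcp-server` are placeholders, and the `OPENAI_API_KEY` entry is an assumption; substitute the actual command, args, and credentials the source repository documents:

```json
{
  "mcpServers": {
    "codex": {
      "command": "npx",
      "args": ["-y", "codex-mcp-server"],
      "env": {
        "OPENAI_API_KEY": "${OPENAI_API_KEY}"
      }
    }
  }
}
```

Merge this into any existing `.claude/settings.json` rather than overwriting it, so other registered servers are preserved.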
## Additional Context
- Scan the user's system to check for an existing `ollama` installation. If not present, execute `curl -fsSL https://ollama.com/install.sh | sh` to install it, then run `ollama pull codestral:22b` to download the recommended local model.
- Install the Codex CLI from its official source repository (github.com/openai/codex). After installation, execute the following commands to configure it for local mode: `codex config set local_mode true` and `codex config set ollama_model codestral:22b`.
- Scan the project for a `.github/workflows` directory. If a CI workflow file exists, add a new job that uses `openai/codex-action@v1` to perform automated code reviews on pull requests. Ensure the workflow includes steps for `actions/checkout`.
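The workflow job in the last bullet might look like the sketch below. `openai/codex-action@v1` and `actions/checkout` come from the source; the trigger, job name, and the `openai-api-key` input name are assumptions to verify against the action's own documentation before committing:

```yaml
name: codex-review
on:
  pull_request:

jobs:
  codex-review:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: openai/codex-action@v1
        with:
          # Assumed input name; check the action's README for the real one.
          openai-api-key: ${{ secrets.OPENAI_API_KEY }}
```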
## Reference Implementation
```shell
curl -fsSL https://ollama.com/install.sh | sh
ollama pull codestral:22b
codex config set local_mode true
codex config set ollama_model codestral:22b
```
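The reference commands above can be wrapped in a guarded sketch that skips installation when `ollama` is already present and, by default, only prints what it would execute. This is an illustrative pattern, not part of the source; set `DRY_RUN=0` to actually run the commands after reviewing them:

```shell
#!/usr/bin/env sh
# DRY_RUN=1 (the default) prints commands instead of executing them,
# so the script can be reviewed safely before a real run.
DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

if command -v ollama >/dev/null 2>&1; then
  echo "ollama already installed: $(command -v ollama)"
else
  # Piping curl to sh is convenient but worth reviewing first (see Guidelines).
  run sh -c 'curl -fsSL https://ollama.com/install.sh | sh'
fi
run ollama pull codestral:22b
```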
## Guidelines
- Adapt everything to my existing project — do not assume a specific stack or directory layout
- Review any fetched code for safety before installing or executing it
- After setup, run a quick verification and show me a summary of exactly what was installed, where, and how to use it
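For the post-setup verification in the last guideline, a simple sketch is to report whether each expected tool is on `PATH` and, if so, its version. The `--version` flag is an assumption for `codex` (most CLIs support it); the check degrades gracefully either way:

```shell
#!/usr/bin/env sh
# Report install status of each tool without failing the script:
# prints "<tool>: <version>" when found, "<tool>: MISSING" otherwise.
check_tools() {
  for tool in ollama codex; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool: $("$tool" --version 2>/dev/null | head -n1)"
    else
      echo "$tool: MISSING"
    fi
  done
}
check_tools
```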