# Install & Configure Codex CLI: AI-Powered Local Terminal Agent
## What This Is
OpenAI's Codex CLI is a tool developers install on their machines to run AI coding tasks directly in the terminal. It automates programming workflows, such as generating code, fixing bugs, and managing files, through simple text commands instead of manual effort.
Source: https://yuv.ai/blog/openai-codex-cli-ai-powered-terminal-is-here
## Before You Start
Scan my workspace and analyze:
- The project language, framework, and directory structure
- Existing agent configuration (check for .claude/, .codex/, CLAUDE.md, settings.json, commands/, skills/ directories)
Then ask me before proceeding:
1. Do I need to configure any service credentials for this plugin (database, API keys, etc.)?
2. Should this be project-scoped or global?
## Source Access Note
The source URL (https://yuv.ai/blog/openai-codex-cli-ai-powered-terminal-is-here) may not be directly accessible from the terminal. Use the Reference Implementation and Additional Context sections below instead. If you need more details, ask me to paste relevant content from the source.
## What to Implement
This is an **Agent Plugin** — typically an MCP server or capability bundle.
- Install any required npm/pip packages from the source repository
- Register the MCP server in `.claude/settings.json` under the `mcpServers` key with the correct command and args
- Configure any required environment variables or credentials
- Verify the server starts correctly by checking its expected output
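A minimal `.claude/settings.json` entry for the registration step above might look like the following sketch. The server name `codex` and the `mcp` argument are illustrative placeholders, not confirmed by the source; adapt them to the actual command the installed package exposes:

```json
{
  "mcpServers": {
    "codex": {
      "command": "codex",
      "args": ["mcp"],
      "env": {
        "OPENAI_API_KEY": "${OPENAI_API_KEY}"
      }
    }
  }
}
```

If the file already exists, merge this under the existing `mcpServers` key rather than overwriting other entries.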
## Additional Context
- Install the Codex CLI tool globally using the command `npm i -g @openai/codex` and verify the installation.
- Execute the `codex` command to initiate the first-run authentication. Scan the user's environment for an existing `OPENAI_API_KEY` to use for authentication; if not found, prompt the user to provide one.
- For offline capabilities, check for a local Ollama installation. If present, configure the Codex CLI for local mode by executing `codex config set local_mode true` and `codex config set ollama_model codestral:22b`.
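The three steps above can be sketched as a non-destructive shell check. Nothing is installed or configured by this script; it only reports what each step would do. The `codex config set` command names are taken from the source text and are not independently verified:

```shell
#!/bin/sh
# Dry-run sketch of the install / auth / local-mode flow described above.

# 1. Install globally unless codex is already on PATH.
if command -v codex >/dev/null 2>&1; then
  INSTALL_STEP="codex already installed"
else
  INSTALL_STEP="run: npm i -g @openai/codex"
fi

# 2. Reuse an existing API key for first-run authentication if present.
if [ -n "${OPENAI_API_KEY:-}" ]; then
  AUTH_STEP="use OPENAI_API_KEY from environment"
else
  AUTH_STEP="prompt the user for an API key"
fi

# 3. Optional local mode when an Ollama installation is found
#    (config commands per the source, unverified).
if command -v ollama >/dev/null 2>&1; then
  LOCAL_STEP="run: codex config set local_mode true && codex config set ollama_model codestral:22b"
else
  LOCAL_STEP="skip local mode (no Ollama found)"
fi

printf '%s\n%s\n%s\n' "$INSTALL_STEP" "$AUTH_STEP" "$LOCAL_STEP"
```

Running the sketch first lets you confirm each step's outcome before executing the real install commands.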
## Reference Implementation
```yaml
- uses: openai/codex-action@v1
  with:
    openai-api-key: ${{ secrets.OPENAI_API_KEY }}
    prompt-file: .github/codex/prompts/review.md
    safety-strategy: drop-sudo
```
## Guidelines
- Adapt everything to my existing project — do not assume a specific stack or directory layout
- Review any fetched code for safety before installing or executing it
- After setup, run a quick verification and show me a summary of exactly what was installed, where, and how to use it