# Install & Configure Open Codex: A Local-First, Open-Source Command-Line AI Assistant
## What This Is
Open Codex is a command-line AI assistant for developers that runs entirely on local machines, ensuring data privacy and eliminating API costs. It translates natural language instructions into terminal commands, speeding up development workflows without sending code to the cloud.
Source: https://github.com/codingmoh/open-codex
## Before You Start
Scan my workspace and analyze:
- The project language, framework, and directory structure
- Existing agent configuration (check for .claude/, .codex/, CLAUDE.md, settings.json, commands/, skills/ directories)
- Whether this repository or a similar tool is already cloned or installed
Then ask me before proceeding:
1. Do I need to configure any service credentials for this plugin (database, API keys, etc.)?
2. Should this be project-scoped or global?
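The scan step above can be sketched as a small shell check; the paths probed are exactly the ones named in the checklist, so extend the list for any other agent tooling you use:

```shell
# Look for existing agent configuration in the current project.
# The paths checked are the ones named in the checklist above.
found=""
for p in .claude .codex CLAUDE.md .claude/settings.json commands skills; do
  [ -e "$p" ] && found="$found $p"
done
if [ -n "$found" ]; then
  msg="existing agent config:$found"
else
  msg="no existing agent configuration found"
fi
echo "$msg"
```

Running this from the project root makes the project-scoped vs. global question concrete: if it prints existing config, the new setup should merge with it rather than replace it.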
## Fetch the Source
Clone or inspect the repository to understand what needs to be installed:
```bash
# Requires the GitHub CLI; without it, use:
# git clone https://github.com/codingmoh/open-codex.git
gh repo clone codingmoh/open-codex
```
Review the README, directory structure, and any install instructions before proceeding.
## What to Implement
This is an **Agent Plugin** — typically an MCP server or capability bundle.
- Install any required npm/pip packages from the source repository
- Register the MCP server in `.claude/settings.json` under the `mcpServers` key with the correct command and args
- Configure any required environment variables or credentials
- Verify the server starts correctly by checking its expected output
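If the repository turns out to expose an MCP server, registration could look like the sketch below. The `open-codex-mcp` command name is a placeholder, not a documented entry point; substitute whatever the repository's README specifies, and merge the entry into any existing `settings.json` rather than overwriting it. The sketch writes to a scratch directory to show the shape of the file:

```shell
# Write a minimal .claude/settings.json in a scratch directory.
# In a real project you would merge this entry into the existing file.
demo=$(mktemp -d)
mkdir -p "$demo/.claude"
cat > "$demo/.claude/settings.json" <<'EOF'
{
  "mcpServers": {
    "open-codex": {
      "command": "open-codex-mcp",
      "args": [],
      "env": {}
    }
  }
}
EOF
cat "$demo/.claude/settings.json"
```

The `env` map is where any credentials from the "Before You Start" questions would go, so secrets stay out of the command line.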
## Additional Context
- Install the Open Codex tool into the user's environment using the cross-platform pipx method by running `pipx install open-codex`. If pipx is not available, fall back to the appropriate method for the user's OS: Homebrew on macOS, or pip (preferably inside a virtual environment) on other platforms.
- Verify the installation by running a simple command, such as `open-codex --help`, and confirm that the tool is accessible in the system's PATH.
- Check for a running Ollama service on the user's local machine. If one is detected, demonstrate the tool's core functionality by executing a command against an Ollama model, for example `open-codex --ollama --model llama3 "list all python files modified in the last 24 hours"`. If Ollama is not running, tell the user that local-model support becomes available once Ollama is installed and started.
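The three bullets above can be combined into one script. The package name `open-codex` and the `--ollama --model` flags come from the source repository; the probe URL assumes Ollama's default port, 11434:

```shell
# 1. Pick an installer: pipx preferred, Homebrew on macOS, pip otherwise.
if command -v pipx >/dev/null 2>&1; then
  install_cmd="pipx install open-codex"
elif [ "$(uname)" = "Darwin" ] && command -v brew >/dev/null 2>&1; then
  install_cmd="brew install open-codex"
else
  install_cmd="pip install open-codex"
fi
echo "install with: $install_cmd"

# 2. After installing, verify the CLI is on PATH.
if command -v open-codex >/dev/null 2>&1; then
  open-codex --help >/dev/null && echo "open-codex is installed and on PATH"
else
  echo "open-codex not found on PATH yet"
fi

# 3. Probe for a local Ollama service (default port 11434).
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama detected; local models are available"
else
  echo "Ollama not running; install and start it to use local models"
fi
```

Each step degrades gracefully: the script reports what it would do or what is missing instead of failing, which suits an interactive setup flow.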
## Reference Implementation
```bash
open-codex --ollama --model llama3 "find all JPEGs larger than 10MB"
```
## Guidelines
- Adapt everything to my existing project — do not assume a specific stack or directory layout
- Review any fetched code for safety before installing or executing it
- After setup, run a quick verification and show me a summary of exactly what was installed, where, and how to use it