Codex CLI v0.75.0+ Integration Guide
Overview
This document outlines the integration with OpenAI Codex CLI v0.75.0+, highlighting breaking changes, new features, and implementation details for the MCP server wrapper.
Version Compatibility
Recommended Version: v0.75.0+
This MCP server is optimized for Codex CLI v0.75.0 or later for full feature support.
Version History:
- v0.75.0: Added `codex exec review` command, sandbox modes, full-auto mode
- v0.74.0: Introduced `gpt-5.2-codex` model
- v0.50.0: Introduced `--skip-git-repo-check` flag, removed `--reasoning-effort` flag
- v0.36.0-v0.49.x: Not compatible with this MCP server version (use older MCP releases)
Breaking Changes ⚠️
v0.75.0 Changes (Current)
Resume command moved under exec
- Old: `codex resume <id>`
- New: `codex exec resume <id>`
- Impact: MCP server updated to use the new command structure
v0.50.0 Changes
`--skip-git-repo-check` flag now required
- Required when running outside git repositories or in untrusted directories
- Prevents "Not inside a trusted directory" errors
- Impact: All MCP server commands now include this flag

`--reasoning-effort` flag changed
- The standalone flag was removed in Codex CLI v0.50.0
- Now passed via the `-c model_reasoning_effort=<level>` config flag
- Impact: MCP server updated to use the new config-based approach
v0.36.0 Changes (Historical)
Authentication Method Change
- Old Method: `OPENAI_API_KEY` environment variable
- New Method: `codex login --api-key "your-api-key"`
- Storage: Credentials are now stored in `CODEX_HOME/auth.json`
- Impact: Users must re-authenticate using the new login command
New Features Implemented
1. Code Review (v0.75.0+)
Command: `codex exec review`
CLI Flags:
- `--uncommitted`: Review staged, unstaged, and untracked changes
- `--base <branch>`: Review changes against a base branch
- `--commit <sha>`: Review changes introduced by a specific commit
- `--title <title>`: Optional title for the review summary
MCP Tool: New `review` tool with all parameters exposed
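As a minimal sketch of how the wrapper might assemble such a call (the base branch and title are hypothetical values, and the command line is printed rather than executed, so no Codex installation is assumed):

```shell
# Assemble the review invocation the way the MCP wrapper would.
# "main" and the title are illustrative; we print instead of executing.
base="main"
args=(codex exec --skip-git-repo-check review --base "$base" --title "Pre-merge review")
printf '%s\n' "${args[*]}"
```

Note that `--skip-git-repo-check` sits on `exec`, before the `review` subcommand.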
2. Sandbox Mode (v0.75.0+)
CLI Flag: `--sandbox <mode>`
Modes:
- `read-only`: No file writes allowed
- `workspace-write`: Writes only in the workspace directory
- `danger-full-access`: Full system access (dangerous)
MCP Parameter: `sandbox` parameter in the codex tool
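A hedged sketch of passing a sandbox mode through, with a guard against unknown values (the validation list mirrors the modes above; the command is printed, not run):

```shell
# Validate the requested sandbox mode before forwarding it (illustrative).
mode="workspace-write"
case "$mode" in
  read-only|workspace-write|danger-full-access) ;;
  *) echo "invalid sandbox mode: $mode" >&2; exit 1 ;;
esac
args=(codex exec --skip-git-repo-check --sandbox "$mode")
printf '%s\n' "${args[*]}"
```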
3. Full-Auto Mode (v0.75.0+)
CLI Flag: `--full-auto`
Description: Sandboxed automatic execution without approval prompts
Equivalent to: `-a on-request --sandbox workspace-write`
MCP Parameter: `fullAuto` boolean parameter
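The equivalence above can be illustrated by printing both spellings side by side (a sketch; neither command is executed):

```shell
# --full-auto is shorthand for the approval-policy/sandbox pair described above.
short=(codex exec --skip-git-repo-check --full-auto)
long=(codex exec --skip-git-repo-check -a on-request --sandbox workspace-write)
printf '%s\n' "${short[*]}"
printf '%s\n' "${long[*]}"
```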
4. Working Directory (v0.75.0+)
CLI Flag: `-C <dir>`
Description: Set the working directory for the agent
MCP Parameter: `workingDirectory` parameter in both the codex and review tools
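A small sketch (the directory and prompt are hypothetical values); note that `-C`, as an exec option, precedes the prompt:

```shell
# -C is an exec option, so it comes before the prompt (values are illustrative).
dir="/tmp/my-project"
args=(codex exec -C "$dir" --skip-git-repo-check "List the TODOs in this repo")
printf '%s\n' "${args[*]}"
```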
5. Model Selection
Default Model: `gpt-5.2-codex` (optimal for agentic coding tasks)
CLI Flag: `--model <model-name>`
Supported Models:
- `gpt-5.2-codex` (default, specialized for agentic coding)
- `gpt-5.1-codex` (previous coding model)
- `gpt-5.1-codex-max` (enhanced coding)
- `gpt-5-codex` (base GPT-5 coding)
- `gpt-4o` (fast multimodal)
- `gpt-4` (advanced reasoning)
- `o3` (OpenAI reasoning model)
- `o4-mini` (compact reasoning model)
Usage: Model parameter available in `exec`, `resume`, and `review` modes
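A sketch contrasting the two ways a model is selected, since `--model` is only valid on plain `exec` (the conversation ID and prompt are hypothetical):

```shell
# Plain exec accepts --model directly; resume must use a -c config override.
model="gpt-5.2-codex"
exec_args=(codex exec --skip-git-repo-check --model "$model" "Refactor utils.py")
resume_args=(codex exec --skip-git-repo-check -c "model=\"$model\"" resume conv-123)
printf '%s\n' "${exec_args[*]}"
printf '%s\n' "${resume_args[*]}"
```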
6. Reasoning Effort Control
CLI Flag: `-c model_reasoning_effort="<level>"`
Levels: `minimal`, `low`, `medium`, `high`
MCP Parameter: `reasoningEffort` parameter in the codex tool
Note: The standalone `--reasoning-effort` flag was removed in v0.50.0; the server now passes quoted config values for consistency
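For illustration, a reasoning-effort override assembled as a quoted config value (printed, not executed):

```shell
# The effort level travels as a quoted -c config override, not a standalone flag.
effort="high"
args=(codex exec --skip-git-repo-check -c "model_reasoning_effort=\"$effort\"")
printf '%s\n' "${args[*]}"
```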
7. Native Resume Functionality
Command: `codex exec resume <conversation-id>`
Automatic ID Extraction: The server extracts conversation IDs from CLI output
Regex Pattern: `/conversation\s*id\s*:\s*([a-zA-Z0-9-]+)/i`
Fallback Strategy: Manual context building when native resume is unavailable
Session Integration: Seamless integration with session management
Features Not Yet Supported
The following Codex CLI features are not currently exposed through the MCP server:
| Feature | CLI Flags | Description |
| --- | --- | --- |
| Image Attachments | `-i, --image` | Attach images to prompts |
| OSS/Local Models | `--oss`, `--local-provider` | LMStudio/Ollama support |
| Config Profiles | `-p, --profile` | Named configuration profiles |
| Approval Policy | `-a, --ask-for-approval` | Fine-grained approval control |
| Web Search | `--search` | Enable the web search tool |
| Additional Dirs | `--add-dir` | Extra writable directories |
| JSON Output | `--json` | JSONL event stream output |
| Output Schema | `--output-schema` | Structured JSON output |
| Output File | `-o, --output-last-message` | Write the response to a file |
These features may be added in future versions based on user demand.
Implementation Details
Command Construction (v0.75.0+)
IMPORTANT: All exec options (`--model`, `-c`, `--skip-git-repo-check`, `-C`, `--sandbox`, `--full-auto`) must come BEFORE the subcommand (`resume`, `review`).
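A sketch of the ordering rule (the conversation ID is a hypothetical value; the wrong ordering appears only in a comment):

```shell
# Wrong: codex exec resume conv-123 --skip-git-repo-check  (flag after subcommand)
# Right: exec options first, then the subcommand and its arguments.
args=(codex exec --skip-git-repo-check resume conv-123)
printf '%s\n' "${args[*]}"
```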
Important: Resume Mode Limitations
The `codex exec resume` subcommand supports a limited set of flags compared to `codex exec`:
- ✅ `-c, --config`: Configuration overrides (use for model selection)
- ✅ `--enable` / `--disable`: Feature toggles
- ❌ `--model`: Not available (use `-c model="..."` instead)
- ❌ `--sandbox`: Not available in resume mode
- ❌ `--full-auto`: Not available in resume mode
- ❌ `-C`: Not available in resume mode
- ⚠️ `--skip-git-repo-check`: Must be placed on the `exec` command BEFORE the `resume` subcommand
Important: Review Mode Limitations
The `codex exec review` subcommand also has a limited flag set:
- ✅ `-c, --config`: Configuration overrides (use for model selection)
- ✅ `--uncommitted`, `--base`, `--commit`, `--title`: Review-specific flags
- ✅ `--enable` / `--disable`: Feature toggles
- ❌ `--model`: Not available (use `-c model="..."` instead)
- ❌ `--sandbox`: Not available in review mode
- ❌ `--full-auto`: Not available in review mode
- ⚠️ `-C`: Must be placed on the `exec` command BEFORE the `review` subcommand
Key Changes in v0.75.0:
- Added: `codex exec review` subcommand for code reviews
- Added: `--sandbox` flag for sandbox modes (exec only)
- Added: `--full-auto` flag for automatic execution (exec only)
- Changed: `codex resume` moved to `codex exec resume`
- Note: Resume and review subcommands have limited flag support
Key Changes in v0.50.0:
- Added: `--skip-git-repo-check` flag (exec only)
- Changed: `--reasoning-effort` to `-c model_reasoning_effort=<level>`
Conversation ID Extraction
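A shell sketch of the extraction, mirroring the regex documented above (the sample CLI output is fabricated for illustration):

```shell
# Fabricated sample of CLI output containing a conversation ID.
output='run complete
conversation id: abc-123-def
tokens used: 1234'
# Case-insensitive match mirroring /conversation\s*id\s*:\s*([a-zA-Z0-9-]+)/i,
# then strip everything up to and including the colon.
id=$(printf '%s\n' "$output" \
  | grep -oiE 'conversation[[:space:]]*id[[:space:]]*:[[:space:]]*[A-Za-z0-9-]+' \
  | head -n1 \
  | sed -E 's/.*:[[:space:]]*//')
printf '%s\n' "$id"
```

If no match is found, `id` is empty and the server falls back to manual context building.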
Error Handling Enhancements
Authentication Errors: Clear messaging for login requirement
Model Validation: Graceful handling of invalid model names
Network Issues: Proper error propagation and user feedback
CLI Availability: Detection of missing Codex CLI installation
Migration Guide
For Existing Users (Upgrading to v0.75.0+)
1. Check current version: run `codex --version`
2. Update Codex CLI if below v0.75.0 (for example, `npm install -g @openai/codex@latest` if installed via npm)
3. Verify the version again (must be v0.75.0 or later)
4. Test the new features (review, sandbox modes, full-auto)
For New Users
1. Install Codex CLI v0.75.0+ (for example, `npm install -g @openai/codex`)
2. Verify the version: `codex --version`
3. Authenticate: `codex login --api-key "your-api-key"`
4. Configure (optional): adjust defaults in `CODEX_HOME/config.toml` if needed
5. Test the setup with a simple `codex exec` prompt
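A minimal setup check, assuming nothing about the environment (it only reports whether the CLI is on `PATH`):

```shell
# Report whether the codex binary is available before attempting a first run.
if command -v codex >/dev/null 2>&1; then
  status="installed: $(codex --version)"
else
  status="codex CLI not found on PATH"
fi
printf '%s\n' "$status"
```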
Performance Optimizations
Smart Model Selection
Default to gpt-5.2-codex: Optimal for agentic coding without configuration
Context-Aware Suggestions: Better model recommendations based on task type
Consistent Experience: Same model across session interactions
Efficient Context Management
Native Resume Priority: Use Codex's built-in conversation continuity
Fallback Context: Only when native resume unavailable
Token Optimization: Minimal context overhead for better performance
Error Recovery
Graceful Degradation: Continue operation despite CLI issues
Automatic Retry: For transient network issues
Clear Error Messages: Actionable feedback for user troubleshooting
Testing Strategy
Integration Testing
CLI Command Validation: Verify correct parameter passing
Conversation ID Extraction: Test various output formats
Error Scenario Handling: Comprehensive failure mode coverage
Edge Case Coverage
Malformed CLI Output: Handle unexpected response formats
Network Interruptions: Graceful handling of connectivity issues
Model Availability: Handle model deprecation or unavailability
Best Practices
For Developers
Always specify model explicitly when behavior consistency is critical
Use appropriate reasoning effort based on task complexity
Implement proper error handling for CLI interactions
Monitor session lifecycle to prevent memory leaks
For Users
Start with default settings for optimal experience
Use sessions for complex tasks requiring multiple interactions
Choose reasoning effort wisely to balance speed and quality
Keep CLI updated for latest features and bug fixes
Troubleshooting
Common Issues
Authentication Failures
- Solution: Run `codex login --api-key "your-key"`
- Verify: Check that `CODEX_HOME/auth.json` exists
Model Not Available
- Solution: Use the default `gpt-5.2-codex` or try alternative models
- Check: Codex CLI documentation for available models
Resume Functionality Not Working
- Solution: The system falls back to manual context building
- Check: Conversation ID extraction in server logs
Performance Issues
- Solution: Lower the reasoning effort or use faster models
- Monitor: Response times and adjust parameters accordingly