
Codex CLI

Codex CLI is OpenAI's lightweight terminal agent. It runs code inside a sandboxed environment, supports configurable approval policies (from full auto-approve to manual confirmation on every step), and connects to MCP servers through a config.toml file. It is purpose-built for quick coding tasks where safety and simplicity matter more than extensive IDE-like features.


Transport: Streamable HTTP
Platform: Windows · macOS · Linux
MCP via: config.toml

Sandboxed Execution with MCP Access

Codex CLI runs generated code inside a sandbox by default — network access is blocked unless you enable it, and file writes are restricted to the project directory. MCP tool calls are treated as controlled external connections, operating through the same approval policies that govern shell commands.

This makes Codex CLI ideal when you want the AI to iterate quickly on code while keeping tight guardrails. The sandbox ensures generated scripts can't cause unintended side effects, while MCP provides the external context (database schemas, API specs, deployment configs) the agent needs to produce accurate code.

Key traits:

  • Sandbox-first — code runs in an isolated environment with explicit permission grants
  • Approval policies — choose suggest, auto-edit, or full-auto modes
  • TOML config — MCP servers and preferences live in ~/.codex/config.toml
  • OpenAI models — optimized for GPT-4o and o-series reasoning models
  • Open-source — MIT license, fast release cycle
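The traits above all live in the same file as the MCP server entries. A minimal sketch of what `~/.codex/config.toml` might look like with both set; the `model` and `approval_policy` key names and values are assumptions for illustration, so verify them against your installed Codex CLI version:

```toml
# ~/.codex/config.toml — illustrative sketch; key names are assumptions,
# check your Codex CLI version's documentation for the exact spelling.

# Default model for the agent
model = "gpt-4o"

# Approval policy: suggest | auto-edit | full-auto
approval_policy = "suggest"

# MCP servers the agent may call (subject to the approval policy)
[mcp_servers.vinkius]
url = "https://edge.vinkius.com/{TOKEN}/mcp"
```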

How to Connect

1. Create a Vinkius Token

In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.

2. Edit config.toml

Open ~/.codex/config.toml (create it if it doesn't exist) and add:

```toml
[mcp_servers.vinkius]
url = "https://edge.vinkius.com/{TOKEN}/mcp"
```
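A remote server uses a `url` entry as above. Codex CLI can also spawn local MCP servers as subprocesses via `command`/`args` entries; a hedged sketch, where the package name and environment variable are placeholders invented for illustration:

```toml
# Local stdio MCP server, spawned by Codex on startup.
# "example-mcp-server" and EXAMPLE_API_KEY are hypothetical.
[mcp_servers.local_docs]
command = "npx"
args = ["-y", "example-mcp-server"]
env = { "EXAMPLE_API_KEY" = "sk-placeholder" }
```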

3. Launch Codex

```bash
codex
```

MCP tools load on startup. The agent uses them whenever the task requires external information, subject to your approval policy.


FAQ

Does the sandbox affect MCP calls? MCP calls go through the Codex approval system. In suggest mode, you confirm each call before it executes. In full-auto, calls run without prompts.

Which OpenAI models work with Codex CLI? GPT-4o is the default. o-series reasoning models (o1, o3) are also supported for complex multi-step tasks.

Can I use non-OpenAI models? Codex CLI is designed for OpenAI models. For other providers, consider Aider or OpenCode.

Is Codex CLI free? The CLI itself is free and open-source; model usage requires an OpenAI API key with pay-as-you-go billing.