Codex reads its configuration from `~/.codex/config.toml` (user-level) or `.codex/config.toml` (project-level).
- Simple Setup: one config file in your project
- OpenAI Compatible: uses Venice's OpenAI-compatible API
- Model Flexibility: swap in any supported Venice text model
Prerequisites
- A Venice API key from `venice.ai/settings/api`
- Codex CLI installed and working on your machine
Setup
Replace the two placeholders. Update:

- `model` with the Venice model ID you want to use
- `experimental_bearer_token` with your real Venice API key
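A minimal sketch of what the config might look like. This is an assumption-laden example, not a verified configuration: the `base_url`, the provider table layout, and the model ID are placeholders to check against Venice's and Codex's current documentation.

```toml
# Sketch only: base_url and the model ID are assumptions, not verified values.
model = "venice-model-id"        # placeholder; replace with a real Venice model ID
model_provider = "venice"

[model_providers.venice]
name = "Venice"
base_url = "https://api.venice.ai/api/v1"          # assumed OpenAI-compatible endpoint
wire_api = "responses"                             # required for this provider setup
experimental_bearer_token = "YOUR_VENICE_API_KEY"  # replace with your real key
```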
Official Codex Config Locations

- User defaults: `~/.codex/config.toml`
- Project overrides: `.codex/config.toml` (loaded only for trusted projects)
Configuration Precedence (Highest First)
- CLI flags and `--config` overrides
- Profile values (`--profile <name>`)
- Project config layers (`.codex/config.toml`, closest directory wins)
- User config (`~/.codex/config.toml`)
- System config (`/etc/codex/config.toml`, Unix)
- Built-in defaults
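The precedence rules above amount to a layered merge where a higher layer's value shadows the same key in every layer below it. The following toy Python sketch (not Codex's actual implementation) illustrates the idea:

```python
# Toy model of layered config precedence: earlier layers in the list win.
def resolve(layers):
    """Merge config layers, highest precedence first."""
    merged = {}
    for layer in reversed(layers):  # apply lowest-precedence layers first
        merged.update(layer)        # higher layers overwrite as we go
    return merged

cli = {"model": "from-cli"}
project = {"model": "from-project", "wire_api": "responses"}
user = {"model": "from-user"}

# The CLI layer shadows `model`, but `wire_api` survives from the project layer.
print(resolve([cli, project, user]))
```

Here `wire_api` is only set at the project layer, so it passes through unchanged, while `model` is defined in all three layers and the CLI value wins.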
Notes
- Keep your API key private and never commit real keys to git.
- Codex ignores project `.codex/config.toml` when a project is marked untrusted.
- If you switch models, only update the `model` field.
- The `wire_api = "responses"` setting is required for this provider setup.