Codex CLI / Codex App
Codex supports two wire protocols: responses and
chat. Choose based on what the model speaks: gpt-5.4 uses the
Responses API, while MiniMax-M2.5 uses Chat Completions. Once
~/.codex/config.toml and XAI_API_KEY
are configured, Codex App can reuse the same gateway setup. On
Windows, the config path is
%USERPROFILE%\.codex\config.toml.
# ~/.codex/config.toml
model_provider = "xai"
model = "gpt-5.4"
model_reasoning_effort = "xhigh"
plan_mode_reasoning_effort = "xhigh"
model_reasoning_summary = "detailed"
model_verbosity = "high"
approval_policy = "never"
sandbox_mode = "danger-full-access"
[model_providers.xai]
name = "xai"
base_url = ""
wire_api = "responses"
requires_openai_auth = false
env_key = "XAI_API_KEY"
# launch
export XAI_API_KEY="sk-Xvs..."
codex --yolo
# ~/.codex/config.toml
[model_providers.xai]
name = "xai"
base_url = ""
env_key = "XAI_API_KEY"
wire_api = "chat"
requires_openai_auth = false
[profiles.minimax]
model = "MiniMax-M2.5"
model_provider = "xai"
# launch
export XAI_API_KEY="sk-Xvs..."
codex --profile minimax
:: Config file path
:: %USERPROFILE%\.codex\config.toml
:: Set API key
set XAI_API_KEY=sk-Xvs...
:: Option A (responses)
codex
:: Option B (chat)
codex --profile minimax
# Config file path
# $env:USERPROFILE\.codex\config.toml
$env:XAI_API_KEY="sk-Xvs..."
# Option A (responses)
codex
# Option B (chat)
codex --profile minimax
Verify with: codex (responses) or codex --profile minimax (chat)
Claude Code (gpt-5.4)
Claude Code is configured primarily through environment variables.
The examples below map Claude's default model families
(Opus, Sonnet, Haiku) to gpt-5.4.
export XAI_API_KEY="sk-Xvs..."
export ANTHROPIC_AUTH_TOKEN="$XAI_API_KEY"
export ANTHROPIC_BASE_URL=""
# Optional: custom default model mapping for Claude families (not required)
export ANTHROPIC_DEFAULT_OPUS_MODEL="gpt-5.4"
export ANTHROPIC_DEFAULT_SONNET_MODEL="gpt-5.4"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="gpt-5.4"
# launch Claude Code
claude
set XAI_API_KEY=sk-Xvs...
set ANTHROPIC_AUTH_TOKEN=%XAI_API_KEY%
set ANTHROPIC_BASE_URL=
:: Optional: custom default model mapping for Claude families (not required)
set ANTHROPIC_DEFAULT_OPUS_MODEL=gpt-5.4
set ANTHROPIC_DEFAULT_SONNET_MODEL=gpt-5.4
set ANTHROPIC_DEFAULT_HAIKU_MODEL=gpt-5.4
claude
$env:XAI_API_KEY="sk-Xvs..."
$env:ANTHROPIC_AUTH_TOKEN=$env:XAI_API_KEY
$env:ANTHROPIC_BASE_URL=""
# Optional: custom default model mapping for Claude families (not required)
$env:ANTHROPIC_DEFAULT_OPUS_MODEL="gpt-5.4"
$env:ANTHROPIC_DEFAULT_SONNET_MODEL="gpt-5.4"
$env:ANTHROPIC_DEFAULT_HAIKU_MODEL="gpt-5.4"
claude
Verify with: claude
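The variable wiring above is just a one-to-one mapping; a minimal Python sketch of the same steps (the truncated key from the doc is used as a placeholder):

```python
import os

# Mirror the xAI key into the Anthropic-compatible auth variable,
# exactly what the export/set lines above do
os.environ["XAI_API_KEY"] = "sk-Xvs..."  # truncated placeholder
os.environ["ANTHROPIC_AUTH_TOKEN"] = os.environ["XAI_API_KEY"]

# Optional: point every Claude model family at gpt-5.4
for family in ("OPUS", "SONNET", "HAIKU"):
    os.environ[f"ANTHROPIC_DEFAULT_{family}_MODEL"] = "gpt-5.4"

assert os.environ["ANTHROPIC_AUTH_TOKEN"] == os.environ["XAI_API_KEY"]
```

Note that Claude Code reads these variables at startup, so set them in the same shell session (or shell profile) from which you run claude.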
OpenCode (Responses: gpt-5.4 / Chat: MiniMax-M2.5)
OpenCode should use the global config file
~/.config/opencode/opencode.jsonc (Windows:
%USERPROFILE%\.config\opencode\opencode.jsonc).
The gateway supports both wire protocols, so use one of the
two complete config profiles below.
Selection order: choose API profile first (A = Responses, B = Chat), then copy the matching OS block (Linux / macOS or Windows).
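Because opencode.jsonc allows // comments, plain json.loads will reject it. A quick way to check your file by hand is to strip line comments first; this is a naive sketch (it assumes no string value contains "//", which holds for the configs below once the $schema line is excluded):

```python
import json
import re

# A trimmed .jsonc fragment in the shape of the configs below
JSONC = """
{
  // provider options (comments are legal in .jsonc)
  "model": "openai/gpt-5.4",
  "provider": {
    "openai": {
      "options": { "apiKey": "{env:XAI_API_KEY}" }
    }
  }
}
"""

# Naive line-comment stripper: safe here because no string contains "//"
cleaned = re.sub(r"//[^\n]*", "", JSONC)
cfg = json.loads(cleaned)

# Model references use the provider-prefixed form "provider/model"
assert cfg["model"].startswith("openai/")
```

For real validation prefer opencode debug config, which uses OpenCode's own parser; this sketch only helps spot syntax errors before launch.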
# ~/.config/opencode/opencode.jsonc
{
"$schema": "https://opencode.ai/config.json",
"model": "openai/gpt-5.4",
"small_model": "openai/gpt-5.4",
"provider": {
"openai": {
"options": {
"baseURL": "",
"apiKey": "{env:XAI_API_KEY}"
}
}
}
}
export XAI_API_KEY="sk-Xvs..."
opencode debug config
:: File path: %USERPROFILE%\.config\opencode\opencode.jsonc
{
"$schema": "https://opencode.ai/config.json",
"model": "openai/gpt-5.4",
"small_model": "openai/gpt-5.4",
"provider": {
"openai": {
"options": {
"baseURL": "",
"apiKey": "{env:XAI_API_KEY}"
}
}
}
}
:: CMD
set XAI_API_KEY=sk-Xvs...
opencode debug config
:: PowerShell
$env:XAI_API_KEY="sk-Xvs..."
opencode debug config
# ~/.config/opencode/opencode.jsonc
{
"$schema": "https://opencode.ai/config.json",
"model": "xai-chat/MiniMax-M2.5",
"small_model": "xai-chat/MiniMax-M2.5",
"provider": {
"xai-chat": {
"npm": "@ai-sdk/openai-compatible",
"options": {
"baseURL": "",
"apiKey": "{env:XAI_API_KEY}"
},
"models": {
"MiniMax-M2.5": {}
}
}
}
}
export XAI_API_KEY="sk-Xvs..."
opencode debug config
:: File path: %USERPROFILE%\.config\opencode\opencode.jsonc
{
"$schema": "https://opencode.ai/config.json",
"model": "xai-chat/MiniMax-M2.5",
"small_model": "xai-chat/MiniMax-M2.5",
"provider": {
"xai-chat": {
"npm": "@ai-sdk/openai-compatible",
"options": {
"baseURL": "",
"apiKey": "{env:XAI_API_KEY}"
},
"models": {
"MiniMax-M2.5": {}
}
}
}
}
:: CMD
set XAI_API_KEY=sk-Xvs...
opencode debug config
:: PowerShell
$env:XAI_API_KEY="sk-Xvs..."
opencode debug config
curl /responses \
-H "Authorization: Bearer ${XAI_API_KEY}" \
-H "Content-Type: application/json" \
-d '{
"model":"gpt-5.4",
"input":"Explain the purpose of a microservice gateway in one sentence"
}'
curl /chat/completions \
-H "Authorization: Bearer ${XAI_API_KEY}" \
-H "Content-Type: application/json" \
-d '{
"model":"MiniMax-M2.5",
"messages":[{"role":"user","content":"Explain the purpose of a microservice gateway in one sentence"}]
}'
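The two curl calls differ only in endpoint and body shape; a small sketch building both payloads side by side makes the contrast explicit (helper names here are illustrative, not part of any SDK):

```python
import json


def responses_payload(model: str, text: str) -> dict:
    # Responses API: a single "input" field carries the prompt
    return {"model": model, "input": text}


def chat_payload(model: str, text: str) -> dict:
    # Chat Completions API: the prompt travels as a "messages" list
    return {"model": model, "messages": [{"role": "user", "content": text}]}


q = "Explain the purpose of a microservice gateway in one sentence"
resp_body = responses_payload("gpt-5.4", q)
chat_body = chat_payload("MiniMax-M2.5", q)

# Both serialize to valid JSON request bodies
assert json.loads(json.dumps(resp_body))["input"] == q
assert chat_body["messages"][0]["role"] == "user"
```

This is the same split the Codex and OpenCode configs encode as wire_api = "responses" versus wire_api = "chat": the client picks the body shape, the gateway routes it to the matching endpoint.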
Verify with: opencode debug config (config) and opencode run "hello" (request)
OpenClaw
OpenClaw can connect to the OpenAI Chat Completions API and the
Claude (Anthropic Messages) API, and can also be extended to the
OpenAI Responses API. XAI Router supports the OpenAI and Claude
APIs by default; for Responses mode,
set api = "openai-responses". Config path:
~/.openclaw/openclaw.json on Linux / macOS, and
%USERPROFILE%\.openclaw\openclaw.json on Windows.
{
"agents": {
"defaults": {
"model": { "primary": "xairouter/MiniMax-M2.5" }
}
},
"models": {
"mode": "merge",
"providers": {
"xairouter": {
"baseUrl": "",
"apiKey": "${XAI_API_KEY}",
"api": "openai-completions",
"models": [{ "id": "MiniMax-M2.5", "name": "MiniMax-M2.5" }]
}
}
}
}
{
"agents": {
"defaults": {
"model": { "primary": "xairouter/MiniMax-M2.5" }
}
},
"models": {
"mode": "merge",
"providers": {
"xairouter": {
"baseUrl": "",
"apiKey": "${XAI_API_KEY}",
"api": "anthropic-messages",
"models": [{ "id": "MiniMax-M2.5", "name": "MiniMax-M2.5" }]
}
}
}
}
{
"agents": {
"defaults": {
"model": { "primary": "xairouter/gpt-5.4" }
}
},
"models": {
"mode": "merge",
"providers": {
"xairouter": {
"baseUrl": "",
"apiKey": "${XAI_API_KEY}",
"api": "openai-responses",
"models": [{ "id": "gpt-5.4", "name": "gpt-5.4" }]
}
}
}
}
set XAI_API_KEY=sk-Xvs...
mkdir "%USERPROFILE%\.openclaw" 2>nul
notepad "%USERPROFILE%\.openclaw\openclaw.json"
openclaw models status
$env:XAI_API_KEY="sk-Xvs..."
New-Item -ItemType Directory -Path "$env:USERPROFILE\.openclaw" -Force | Out-Null
notepad "$env:USERPROFILE\.openclaw\openclaw.json"
openclaw models status
Verify with: openclaw models status
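The OpenClaw configs reference the key as ${XAI_API_KEY} rather than a literal value. A sketch of how that kind of placeholder typically expands from the environment (assumed behavior for illustration, not OpenClaw's actual loader):

```python
import os
import re


def expand_env(value: str) -> str:
    # Replace each ${VAR} with its environment value (empty string if unset)
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)


os.environ["XAI_API_KEY"] = "sk-Xvs..."  # truncated placeholder from the doc
api_key = expand_env("${XAI_API_KEY}")
assert api_key == "sk-Xvs..."
```

The practical takeaway: the key must be present in the environment of the process that starts OpenClaw, which is why the launch steps export or set XAI_API_KEY before running openclaw models status.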