Install
openclaw skills install claude-code-custom-model-proxy

Configure Claude Code to work with custom model providers (like MiniMax or other OpenAI-compatible APIs). Use this skill when users want to: run Claude Code with non-Anthropic models, set up a proxy to convert between Anthropic and OpenAI API formats, or troubleshoot Claude Code connection issues with custom endpoints.

This skill configures Claude Code to work with custom model providers that use the OpenAI API format (like MiniMax) by setting up a proxy server that converts between the Anthropic Messages API and the OpenAI Chat Completions API.
Claude Code uses the Anthropic Messages API format (/v1/messages), but many custom model providers (like MiniMax) use the OpenAI Chat Completions API format (/v1/chat/completions). This skill provides a Python proxy server that translates between the two formats, listening locally at http://127.0.0.1:4002.

Create or edit ~/.claude/settings.json:
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://127.0.0.1:4002",
    "ANTHROPIC_API_KEY": "fake-key-not-needed"
  }
}
Or use environment variables:
export ANTHROPIC_BASE_URL="http://127.0.0.1:4002"
export ANTHROPIC_API_KEY="fake-key"
python3 ~/.workbuddy/skills/claude-code-custom-model-proxy/scripts/claude_code_proxy.py
Or run in background:
nohup python3 ~/.workbuddy/skills/claude-code-custom-model-proxy/scripts/claude_code_proxy.py > /tmp/claude_proxy.log 2>&1 &
Then start Claude Code:
claude --model sonnet
Edit scripts/claude_code_proxy.py to configure:
UPSTREAM_HOST: Your provider's API host (e.g., "api.53hk.cn")
UPSTREAM_PATH: API path (e.g., "/v1/chat/completions")
API_KEY: Your provider's API key
LISTEN_PORT: Proxy listen port (default: 4002)

Also change the default model "MiniMax-M2.7-highspeed" to your model.

Cause: Claude Code validates model names locally before connecting to the API.
Solution: The proxy's GET /v1/models endpoint must return the model name Claude Code expects.
For --model sonnet, Claude Code expects claude-sonnet-4-6 in the models list.
The proxy already includes common model names in its response. Add more if needed:
models = {
    "data": [
        {"type": "model", "id": "claude-sonnet-4-6", "display_name": "Claude Sonnet 4.6"},
        {"type": "model", "id": "claude-opus-4-5", "display_name": "Claude Opus 4.5"},
        # Add more models as needed
    ]
}
Cause: Claude Code sends requests with query strings (e.g., POST /v1/messages?beta=true), but the proxy only checks self.path == "/v1/messages".
Solution: The proxy now uses urlparse() to extract the path without query string:
from urllib.parse import urlparse
parsed_path = urlparse(self.path)
path = parsed_path.path # This removes ?beta=true
Cause: Incorrect handling of UTF-8 encoding in SSE streaming.
Solution: Use byte buffer instead of string buffer:
buffer = b""  # Byte buffer
for chunk in r.iter_content(chunk_size=None, decode_unicode=False):
    if chunk:
        buffer += chunk
        while b"\n" in buffer:
            line_bytes, buffer = buffer.split(b"\n", 1)
            line = line_bytes.strip().decode("utf-8", errors="replace")
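To see why a byte buffer matters, here is a minimal standalone sketch (not taken from the proxy) where a multi-byte UTF-8 character arrives split across two chunks; accumulating raw bytes and decoding only complete lines recovers the text intact, whereas decoding each chunk on its own would corrupt it:

```python
# Simulate an SSE line whose multi-byte UTF-8 text ("你好") is split mid-character.
raw = "你好\n".encode("utf-8")
chunks = [raw[:2], raw[2:]]  # first chunk ends inside the first character

buffer = b""  # accumulate bytes, never partial decodes
lines = []
for chunk in chunks:
    buffer += chunk
    while b"\n" in buffer:
        line_bytes, buffer = buffer.split(b"\n", 1)
        lines.append(line_bytes.strip().decode("utf-8", errors="replace"))

print(lines)  # ['你好']
```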
Cause: Proxy server is not running.
Solution: Start the proxy server before starting Claude Code. Check with:
lsof -i :4002
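As an alternative to lsof, a small Python check (a sketch, not part of the shipped scripts) can confirm that something is accepting TCP connections on the proxy port:

```python
import socket

def proxy_is_listening(host="127.0.0.1", port=4002, timeout=1.0):
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```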
Request conversion (anthropic_to_openai()):
messages array: extract text from content blocks
max_tokens → max_tokens
temperature → temperature
stream: true (always enabled)

Response conversion (openai_to_anthropic()):
message_start
content_block_start
content_block_delta
content_block_stop
message_delta
message_stop

tail -f /tmp/claude_proxy.log
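The request-side mapping can be sketched as follows. This is a simplified illustration assuming plain-text content blocks; the shipped anthropic_to_openai() handles more cases, and UPSTREAM_MODEL here stands in for the model name configured in claude_code_proxy.py:

```python
UPSTREAM_MODEL = "MiniMax-M2.7-highspeed"  # placeholder for your configured model

def anthropic_to_openai(body: dict) -> dict:
    """Sketch of the Anthropic Messages -> OpenAI Chat Completions request mapping."""
    messages = []
    for msg in body.get("messages", []):
        content = msg.get("content", "")
        if isinstance(content, list):
            # Extract text from content blocks
            content = "".join(
                block.get("text", "") for block in content if block.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": UPSTREAM_MODEL,
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
        "temperature": body.get("temperature", 1.0),
        "stream": True,  # streaming is always enabled
    }
```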
# Test models endpoint
curl -s http://127.0.0.1:4002/v1/models | python3 -m json.tool
# Test messages endpoint
curl -X POST http://127.0.0.1:4002/v1/messages \
-H "Content-Type: application/json" \
-d '{"model":"claude-sonnet-4-6","messages":[{"role":"user","content":"Hello"}],"max_tokens":100}'
ls -lt ~/.claude/debug/*.txt | head -1
tail -50 ~/.claude/debug/<latest>.txt
~/.workbuddy/skills/claude-code-custom-model-proxy/
├── SKILL.md # This file
└── scripts/
└── claude_code_proxy.py # Proxy server
In claude_code_proxy.py, line 44:
"model": "MiniMax-M2.7-highspeed", # Change this to your model
In claude_code_proxy.py:
LISTEN_PORT = 4002 # Change to your preferred port
Then update ANTHROPIC_BASE_URL accordingly.
The proxy already includes retry logic (call_upstream_with_retry()). Configure:
MAX_RETRIES: Maximum retry attempts (default: 3)
BASE_WAIT_SECONDS: Base wait time between retries (default: 10)
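The retry shape can be sketched like this (an illustration of the pattern, not the exact shipped implementation; `call` is any zero-argument function that performs the upstream request):

```python
import time

MAX_RETRIES = 3
BASE_WAIT_SECONDS = 10

def call_upstream_with_retry(call, max_retries=MAX_RETRIES, base_wait=BASE_WAIT_SECONDS):
    """Retry an upstream call, waiting longer after each failed attempt."""
    last_err = None
    for attempt in range(1, max_retries + 1):
        try:
            return call()
        except Exception as err:
            last_err = err
            if attempt < max_retries:
                time.sleep(base_wait * attempt)  # linear backoff between attempts
    raise last_err
```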