CLI AI Proxy

v0.1.3

Manage cli-ai-proxy: a local OpenAI-compatible proxy that routes requests through Gemini CLI and Claude Code. The proxy itself reads no credentials; the underlying gemini / claude CLIs handle authentication themselves.

by Leo Liao (@ilzc)
Security Scan
Capability signals
Requires OAuth token · Requires sensitive credentials
These labels describe what authority the skill may exercise. They are separate from suspicious or malicious moderation verdicts.
VirusTotal: Benign
OpenClaw: Benign (medium confidence)
Purpose & Capability
Name/description align with requested binaries (node/npm and gemini/claude) and with what the scripts do (install and run the cli-ai-proxy binary). Requiring Node and an npm package is expected for this purpose.
Instruction Scope
Runtime instructions and scripts call the installed cli-ai-proxy binary (start/stop/status/health) and document the proxy's behavior. The proxy will write config.yaml, .proxy.pid, proxy.log and temporary image files in its working directory, and the configure script edits ~/.openclaw/openclaw.json (with a backup). These are expected but worth noting since they mutate a user config and write local files.
Install Mechanism
Install uses npm install -g cli-ai-proxy from the public registry, which is appropriate for a Node CLI. This is a moderate-risk install vector because remote npm packages can contain arbitrary code; the skill claims the package has no install hooks, but the remote package contents were not provided for verification.
Credentials
The skill itself requires no additional environment variables and defers authentication to the gemini/claude CLIs (the docs explicitly state GEMINI_API_KEY/ANTHROPIC_API_KEY are consumed by those CLIs). The environment variables referenced in docs are proportional and related to the CLIs or proxy config.
Persistence & Privilege
always:false (no forced/global install). The only persistent modification described is optional: configure-provider.sh edits ~/.openclaw/openclaw.json (it claims to make a timestamped .bak). This is coherent with integrating the proxy as a model provider but is a notable config mutation the user should consent to.
Assessment
This skill appears to do what it says: install an npm CLI that proxies OpenAI-style requests through your installed Gemini/Claude CLIs and optionally registers itself in your OpenClaw config. Before installing:

1. Inspect the npm package (npm view cli-ai-proxy, npm pack, or review its source) to confirm there are no unexpected postinstall hooks or malicious code.
2. Prefer installing with an unprivileged Node setup (nvm/fnm/volta) rather than sudo to avoid system-wide changes.
3. Verify the maintainers and package version on the registry.
4. Keep the proxy bound to 127.0.0.1 (do not change server.host to 0.0.0.0), because the proxy is unauthenticated and exposes permissive CORS.
5. Be aware that running configure-provider.sh will modify ~/.openclaw/openclaw.json (a backup should be created) and may change your agent's default model.

If you cannot or do not want to trust the remote npm package, consider extracting and reviewing the package tarball first, or running the package in a confined environment.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

🔀 Clawdis
Bins: node, npm
Any bin: gemini, claude

Install

Install cli-ai-proxy via npm
Bins: cli-ai-proxy
npm i -g cli-ai-proxy
67 downloads
0 stars
3 versions
Updated 2h ago
v0.1.3
MIT-0

CLI AI Proxy

Local OpenAI-compatible proxy that bridges Gemini CLI and Claude Code to a unified REST API. Requests go through the installed CLI tools — the proxy makes no direct AI-vendor API calls and holds no API keys. Authentication (OAuth, API keys, session tokens) is managed by the gemini / claude CLIs themselves.

What This Installs

This skill installs the cli-ai-proxy package from the public npm registry. Specifically:

  • Source: public npm registry package cli-ai-proxy (no GitHub clone, no local build step)
  • Postinstall scripts: none — the package's package.json declares no preinstall/postinstall hooks
  • Runtime dependencies: one — yaml (config parsing)
  • Filesystem writes: only under the global npm prefix (for the binary) and, when the proxy runs, under its working directory for config.yaml, .proxy.pid, and proxy.log
  • Config mutations: only if you explicitly run configure-provider.sh / cli-ai-proxy configure-openclaw, which edits ~/.openclaw/openclaw.json and writes a .bak backup next to it first
  • Network at runtime: localhost HTTP server only; outbound calls are performed by the user's installed gemini / claude CLIs, not by the proxy itself

When to Use

✅ User asks to start/stop/check the AI proxy
✅ User wants to route requests through Gemini CLI or Claude Code
✅ User asks about available models or proxy health
✅ User wants to configure OpenClaw to use the proxy
✅ Troubleshooting proxy connectivity or CLI issues

❌ Direct API calls to OpenAI/Anthropic/Google (this proxy only uses CLI tools)
❌ Managing API keys (CLIs handle their own authentication)

Quick Reference

| Action | Command |
| --- | --- |
| Start proxy | `{baseDir}/scripts/start.sh` |
| Stop proxy | `{baseDir}/scripts/stop.sh` |
| Check status | `{baseDir}/scripts/status.sh` |
| Health check | `{baseDir}/scripts/health.sh` |
| Configure OpenClaw | `{baseDir}/scripts/configure-provider.sh` |
| Full install | `{baseDir}/scripts/install.sh` |

Proxy Lifecycle

Starting

{baseDir}/scripts/start.sh

Starts the proxy on 127.0.0.1:9090 (default). The proxy listens for OpenAI-compatible requests and routes them to the appropriate CLI tool.

Before starting, verify at least one CLI is available:

  • gemini --version (Gemini CLI)
  • claude --version (Claude Code)

Checking Status

{baseDir}/scripts/status.sh

Shows: running/stopped, PID, health endpoint data, available CLI providers, concurrency stats.

Stopping

{baseDir}/scripts/stop.sh

Gracefully shuts down the proxy: stops accepting connections, kills active CLI subprocesses, cleans up.

Available Models

| Model ID | Provider | Backend Model |
| --- | --- | --- |
| gemini | Gemini CLI | CLI default (auto-upgrades) |
| claude | Claude Code | CLI default (auto-upgrades) |
| claude-sonnet | Claude Code | sonnet |
| claude-opus | Claude Code | opus |

When OpenClaw is configured, use as cli-ai-proxy/gemini, cli-ai-proxy/claude, etc.

OpenClaw Integration

To configure OpenClaw to route through the proxy:

{baseDir}/scripts/configure-provider.sh

This automatically:

  1. Adds cli-ai-proxy as a provider in ~/.openclaw/openclaw.json
  2. Registers all proxy models in the agent defaults
  3. Creates a backup of the original config

After configuring, set the default model in openclaw.json:

```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "cli-ai-proxy/gemini" }
    }
  }
}
```

API Endpoints

The proxy exposes:

  • POST /v1/chat/completions — Chat completions (streaming + non-streaming)
  • GET /v1/models — List available models
  • GET /health — Health check with provider status and concurrency info

Default base URL: http://127.0.0.1:9090/v1

For full API details see references/api.md.
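Because the endpoints speak the OpenAI chat schema, any OpenAI-compatible client should work against the base URL above. A minimal stdlib sketch that builds a chat-completions request for the proxy (model names come from the table above; no proxy-specific fields are assumed):

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:9090/v1"  # default proxy address

def chat_request(model: str, prompt: str, stream: bool = False) -> urllib.request.Request:
    """Build an OpenAI-style POST /v1/chat/completions request."""
    body = {
        "model": model,  # e.g. "gemini", "claude", "claude-sonnet"
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the proxy running, send it with:
#   resp = urllib.request.urlopen(chat_request("gemini", "hello"))
```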

Image Support

The proxy supports images in messages. When a request contains image_url content parts:

  1. Images are saved to temporary files
  2. The prompt instructs the CLI to read the image via its built-in file tools
  3. Temp files are automatically cleaned up after each request

Supports both base64 data URLs and remote image URLs.
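Step 1 of that flow, for the base64 case, might look like the sketch below. The function name and details are illustrative, not the proxy's actual code; remote-URL fetching is deliberately out of scope here.

```python
import base64
import os
import re
import tempfile

def save_data_url(data_url: str) -> str:
    """Decode a base64 image data URL into a temp file and return its path."""
    m = re.match(r"data:image/(\w+);base64,(.*)", data_url, re.DOTALL)
    if not m:
        raise ValueError("not a base64 image data URL")
    ext, payload = m.groups()
    fd, path = tempfile.mkstemp(suffix=f".{ext}")
    with os.fdopen(fd, "wb") as f:
        f.write(base64.b64decode(payload))
    return path  # caller removes the file after the request (step 3)
```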

Configuration

Config file: config.yaml in the proxy installation directory.

Key settings:

  • server.port — Listen port (default: 9090)
  • concurrency.max — Max concurrent CLI processes (default: 5)
  • timeout — CLI process timeout in ms (default: 300000)
  • defaultModel — Default model when none specified

For full configuration options see references/configuration.md.
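Put together, a config.yaml using the documented defaults might look like this. The nesting is inferred from the dotted key names above, and server.host and the defaultModel value are assumptions:

```yaml
server:
  host: 127.0.0.1    # keep bound to localhost; the proxy is unauthenticated
  port: 9090
concurrency:
  max: 5             # max concurrent CLI processes
timeout: 300000      # CLI process timeout in ms (5 minutes)
defaultModel: gemini # assumed: any model ID from the table above
```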

Troubleshooting

Proxy won't start

  1. Check if port 9090 is already in use: lsof -i :9090
  2. Verify Node.js is available: node --version
  3. Check logs: read the proxy.log file in the installation directory

CLI not available

  1. Verify CLI is installed and in PATH: which gemini or which claude
  2. Check CLI auth: gemini --version or claude --version
  3. The proxy health endpoint shows which CLIs are available

429 Too Many Requests

The concurrency limit has been reached. Either:

  • Wait for current requests to complete
  • Increase concurrency.max in config.yaml

Timeout errors (504)

The CLI process took too long. Either:

  • Increase timeout in config.yaml
  • Check if the CLI is hanging (auth issues, network)

For more troubleshooting see references/troubleshooting.md.
