Install

```
openclaw skills install nadirclaw
```

This skill installs, configures, and runs the NadirClaw LLM router to cut AI API costs by 40-70%. Use it when the user wants to reduce LLM spending, route prompts to cheaper models, set up a cost-saving proxy, or optimize API usage across providers (OpenAI, Anthropic, Google, Ollama). Also use it when asked about model routing, LLM cost optimization, or setting up NadirClaw with OpenClaw.

NadirClaw is an open-source LLM router that classifies prompts in ~10ms and routes simple prompts to cheap/local models while keeping complex work on premium models.

To install the Python package directly:

```
pip install nadirclaw
```
Run the interactive wizard:

```
nadirclaw setup
```

Or auto-configure for OpenClaw:

```
nadirclaw openclaw onboard
```
This writes NadirClaw into the OpenClaw config as a provider, with model `nadirclaw/auto`. No restart is needed.
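For orientation, the provider entry that onboarding creates might look roughly like the sketch below. This is a hypothetical illustration of the shape only; the actual keys and file location are whatever `nadirclaw openclaw onboard` writes, so inspect your OpenClaw config rather than copying this.

```json
{
  "providers": {
    "nadirclaw": {
      "baseUrl": "http://localhost:8856/v1",
      "models": ["nadirclaw/auto"]
    }
  }
}
```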
Start the proxy:

```
nadirclaw serve --verbose
```

It runs on http://localhost:8856. Any OpenAI-compatible tool can use it by pointing at this URL.
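Because the proxy speaks the OpenAI chat-completions protocol, any HTTP client can talk to it directly. Below is a minimal sketch using only the Python standard library; `chat_request` is a hypothetical helper written for this example (not part of NadirClaw), and the `/chat/completions` path follows the usual OpenAI convention.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8856/v1"  # local NadirClaw proxy (see above)

def chat_request(prompt: str, model: str = "nadirclaw/auto"):
    """Build a POST request for the proxy's OpenAI-style chat endpoint."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("Summarize this file in one line.")
print(req.full_url)  # http://localhost:8856/v1/chat/completions
```

Sending it with `urllib.request.urlopen(req)` works once `nadirclaw serve` is running.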
# OpenClaw (auto)
nadirclaw openclaw onboard
# Claude Code
ANTHROPIC_BASE_URL=http://localhost:8856/v1 claude
# Any OpenAI-compatible tool
OPENAI_BASE_URL=http://localhost:8856/v1 <tool>
Routing profiles

Pass an `x-routing-profile` header, or select one of these models:

- `nadirclaw/auto` - smart routing (default)
- `nadirclaw/eco` - maximize savings
- `nadirclaw/premium` - always use the best model
- `nadirclaw/free` - Ollama/local only
- `nadirclaw/reasoning` - chain-of-thought optimized

Monitoring:

```
nadirclaw savings    # cost savings report
nadirclaw report     # detailed routing analytics
nadirclaw dashboard  # live terminal dashboard
```
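The two selection mechanisms above can be sketched as follows. Both helpers are hypothetical, written for this example; the header-based form assumes profile names match the model suffixes ("eco", "premium", ...), which is an assumption, not something the docs above confirm.

```python
def with_profile_header(body: dict, profile: str) -> tuple[dict, dict]:
    """Pick a profile via the x-routing-profile header, leaving the model on auto."""
    headers = {
        "Content-Type": "application/json",
        "x-routing-profile": profile,  # assumed to accept "eco", "premium", ...
    }
    return headers, body

def with_profile_model(body: dict, profile: str) -> dict:
    """Pick a profile by encoding it in the model name instead of a header."""
    return {**body, "model": f"nadirclaw/{profile}"}

base = {
    "model": "nadirclaw/auto",
    "messages": [{"role": "user", "content": "rename this variable"}],
}

headers, _ = with_profile_header(base, "eco")
print(headers["x-routing-profile"])               # eco
print(with_profile_model(base, "free")["model"])  # nadirclaw/free
```

The model-name form is handy for tools that only let you configure a model string, while the header form keeps the model fixed and varies the profile per request.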
Troubleshooting

- If `nadirclaw serve` fails, check your API keys with `nadirclaw setup`.
- For local routing (`nadirclaw/free`), make sure `ollama serve` is running first.
- Run `nadirclaw report --last 20` to see recent routing decisions.
- Still stuck? Restart with `nadirclaw serve --verbose --log-raw`.