Adaptive Routing
v1.0.0
Routes LLM requests to a local model first (Ollama, LM Studio, llamafile), validates the response quality, and escalates to cloud only when the local result...
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw
Benign (high confidence)

Purpose & Capability
Name/description align with included scripts and docs. The skill only needs python3, checks local LLM HTTP endpoints (Ollama/LM Studio/llamafile), decides routing, validates responses, and records outcomes. No unrelated credentials, binaries, or external services are requested by the skill itself.
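The endpoint check described above amounts to probing a few well-known localhost ports. A minimal sketch, assuming the common defaults for each server (11434 for Ollama, 1234 for LM Studio, 8080 for llamafile); the skill's actual probe list and paths may differ:

```python
import urllib.request
import urllib.error

# Hypothetical defaults; the skill's bundled scripts may use other URLs.
DEFAULT_ENDPOINTS = {
    "ollama": "http://localhost:11434/api/tags",
    "lmstudio": "http://localhost:1234/v1/models",
    "llamafile": "http://localhost:8080/v1/models",
}

def detect_local_llms(endpoints=DEFAULT_ENDPOINTS, timeout=1.0):
    """Return the names of local LLM servers that answer an HTTP probe."""
    available = []
    for name, url in endpoints.items():
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    available.append(name)
        except (urllib.error.URLError, OSError):
            pass  # nothing listening on this port; skip it
    return available
```

A short timeout keeps startup fast when no local server is running, since every probe then fails quickly with a connection error.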
Instruction Scope
SKILL.md and scripts remain within the declared scope: they query localhost endpoints, run local validation heuristics, and persist usage data under ~/.openclaw/adaptive-routing/. The skill does not send user prompts to third-party endpoints itself. Note: it creates/configures files in the user's home directory and runs subprocesses/HTTP requests to localhost (expected for detecting local LLMs).
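The try-local-then-escalate flow with local validation heuristics could be sketched roughly as follows; the heuristics, thresholds, and function names here are illustrative guesses, not the skill's actual code:

```python
# Illustrative-only quality heuristics; the real validation rules live
# in the skill's bundled scripts and may be quite different.
def validate_response(text, min_chars=20):
    """Crude quality check on a local model's reply."""
    if len(text.strip()) < min_chars:
        return False                      # too short to be a real answer
    refusals = ("i cannot", "i can't", "as an ai")
    if text.strip().lower().startswith(refusals):
        return False                      # likely a refusal, not an answer
    return True

def route(prompt, local_generate, cloud_generate):
    """Try the local model first; fall back to cloud only when needed."""
    reply = local_generate(prompt)
    if validate_response(reply):
        return reply, "local"
    return cloud_generate(prompt), "cloud"
```

Note that in this sketch the prompt only leaves the machine when `cloud_generate` is called, which matches the report's claim that the skill itself never sends prompts to third-party endpoints.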
Install Mechanism
This is an instruction-only skill with bundled Python scripts and no install spec. Nothing is downloaded or extracted by the skill itself. The references mention upstream provider install commands (e.g., curl | sh for Ollama) — those are provider instructions, not the skill's installer.
Credentials
No environment variables or credentials are required by the skill. The scripts include secret-detection/redaction patterns to avoid logging API keys and bearer tokens. The skill stores only metadata (tokens, model, outcome) in savings.json — it does not persist prompt contents or credentials.
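Secret redaction of the kind described is typically regex-based: match known key shapes before anything is written to a log. A hedged sketch (the two patterns below are common examples, not the skill's actual rule set):

```python
import re

# Example patterns only; the skill's redaction rules may be broader.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),           # OpenAI-style API keys
    re.compile(r"(?i)bearer\s+[A-Za-z0-9._\-]+"),  # HTTP bearer tokens
]

def redact(text, placeholder="[REDACTED]"):
    """Replace anything matching a secret pattern before it is logged."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text
```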
Persistence & Privilege
The skill persists data under ~/.openclaw/adaptive-routing/ (config.json, savings.json) and does not request always:true or modify system-wide/other-skills configuration. It runs only when invoked and does not demand elevated privileges.
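Persisting only metadata (tokens, model, outcome) to savings.json could look like the sketch below; the exact path is taken from the report, but the record field names are guesses:

```python
import json
from pathlib import Path

# Directory named in the scan report; record shape is hypothetical.
DATA_DIR = Path.home() / ".openclaw" / "adaptive-routing"

def record_outcome(model, tokens, outcome, path=None):
    """Append one metadata record (no prompt text) to savings.json."""
    path = path or DATA_DIR / "savings.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    records = json.loads(path.read_text()) if path.exists() else []
    records.append({"model": model, "tokens": tokens, "outcome": outcome})
    path.write_text(json.dumps(records, indent=2))
    return len(records)
```

Because only counts and labels are stored, deleting the directory at any time loses nothing but the usage history.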
Assessment
This skill appears coherent and implements what it claims. Before installing, consider:
(1) it will create ~/.openclaw/adaptive-routing/ (config and savings files); review or remove these if you don't want local persistence;
(2) it probes localhost ports to detect local LLM servers; if you run untrusted services locally that could be fingerprinted, be aware of that network activity;
(3) the docs reference provider install commands (including curl | sh for Ollama); running remote install scripts can be risky, so prefer official release packages from trusted sources;
(4) the redaction logic reduces the risk of logging API keys, but review the scripts if you plan to feed them highly sensitive prompts;
(5) no cloud credentials are required by the skill itself; escalation to cloud is left to your environment and tooling.
For extra assurance, inspect the bundled Python scripts (they are small and readable) or run them in a restricted/test account first.
Like a lobster shell, security has layers: review code before you run it.
latest: vk979xn00zy6v36y7d57ypnn0hn82548f
Runtime requirements
🔀 Clawdis
Bins: python3
