Skill v1.0.0

ClawScan security

Local-First LLM · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Benign · Feb 24, 2026, 10:27 PM
Verdict
benign
Confidence
high
Model
gpt-5-mini
Summary
The skill's code, instructions, and requirements are coherent with its stated purpose (routing to local LLMs and tracking savings); it only requires python3 and stores data in the user's home directory.
Guidance
This skill is internally consistent with its description. Before installing, note: (1) it will create and update ~/.openclaw/local-first-llm/savings.json to track requests — if you don't want that, remove or sandbox the directory; (2) the skill only probes localhost (ports 11434, 1234, 8080) to detect local LLM servers — ensure you trust services running on those ports; (3) reference docs include example third-party install commands (e.g., curl | sh for Ollama) — do not run those without vetting the upstream source; (4) the skill does not request any cloud API keys, but if you wire it to call cloud APIs yourself, review those calls separately. Overall: reasonable to use, but verify any external provider install steps and the location where savings are persisted.
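If you want to check point (2) yourself before installing, the localhost probe the scanner describes can be sketched in a few lines of Python. This is an illustrative check, not the skill's actual code; the port-to-provider mapping in the comments reflects common defaults (11434 for Ollama, 1234 for LM Studio, 8080 for llama.cpp-style servers) and is an assumption.

```python
import socket

# Ports the skill reportedly probes when detecting local LLM servers.
# Provider associations are common defaults, not confirmed by the skill.
PORTS = [11434, 1234, 8080]

def probe(port: int, host: str = "127.0.0.1", timeout: float = 0.5) -> bool:
    """Return True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in PORTS:
        status = "open" if probe(port) else "closed"
        print(f"localhost:{port} -> {status}")
```

Running this before installation tells you exactly which local services the skill would discover on your machine.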

Review Dimensions

Purpose & Capability
ok · Name/description match the implemented behavior: scripts probe localhost endpoints, decide the route, log savings to ~/.openclaw/local-first-llm/savings.json, and render a dashboard. The required binary (python3) is appropriate and proportional.
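The savings-logging behavior described above can be sketched as follows. The JSON schema (`requests`, `est_saved_usd`) and the function name are hypothetical placeholders, since the scan does not disclose the skill's actual file format; only the path under ~/.openclaw/local-first-llm/ comes from the report.

```python
import json
from pathlib import Path

# Path the skill reportedly writes to; the record schema below is assumed.
SAVINGS_PATH = Path.home() / ".openclaw" / "local-first-llm" / "savings.json"

def record_request(route: str, est_saved_usd: float,
                   path: Path = SAVINGS_PATH) -> dict:
    """Log one routed request and return the updated running totals."""
    path.parent.mkdir(parents=True, exist_ok=True)
    data = {"requests": 0, "est_saved_usd": 0.0}
    if path.exists():
        data = json.loads(path.read_text())
    data["requests"] += 1
    if route == "local":  # only local routing avoids a cloud API cost
        data["est_saved_usd"] = round(data["est_saved_usd"] + est_saved_usd, 4)
    path.write_text(json.dumps(data, indent=2))
    return data
```

A sandboxed variant of this pattern (passing a temp-directory `path`) is one way to honor the guidance above about keeping the savings file out of your home directory.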
Instruction Scope
ok · SKILL.md instructs running included Python scripts and making local HTTP calls to localhost LLM servers. The instructions do not ask to read unrelated system files, exfiltrate data, or call external endpoints (aside from documented localhost endpoints). It documents how to invoke cloud fallbacks but does not itself transmit prompts to remote endpoints.
Install Mechanism
note · There is no automated install spec (instruction-only). All code is included. Reference docs suggest installing third-party local providers (e.g., an example curl | sh for Ollama) — that is documentation only, not performed by the skill; users should vet any external install commands before running them.
Credentials
ok · The skill requests no environment variables or external credentials. It only reads/writes a per-user file under ~/.openclaw and queries localhost ports for provider detection — both proportionate to the functionality.
Persistence & Privilege
ok · The skill writes its own data file (~/.openclaw/local-first-llm/savings.json) and does not claim always:true or modify other skills or system-wide settings. Default autonomous invocation is allowed (platform default) and is not combined with other concerning privileges.