Skill v2.0.0

ClawScan security

Council Brief · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Suspicious · Feb 22, 2026, 9:58 PM
Verdict
suspicious
Confidence
medium
Model
gpt-5-mini
Summary
The skill mostly does what its description claims (clone and run an LLM Council app), but it reads and writes local credential files and starts networked services that the registry metadata does not declare. This mismatch, together with the installer's aggressive process and port management, merits caution.
Guidance
Before installing: understand that this skill will clone and run code from GitHub and will attempt to obtain your LLM API access (OPENROUTER_API_KEY) either from your environment, from ~/.openclaw/workspace/.env, or by reading ~/.openclaw/openclaw.json (gateway token). It will write those credentials into a .env file inside the cloned project, run 'uv sync' and 'npm ci', then start backend and frontend services that bind to the host and may be reachable from your local network. It will also kill any processes listening on ports 8001, 5173, or 4173. Recommended actions:

1. Verify the upstream repository (https://github.com/jeadland/llm-council) and review the code it will run.
2. Back up or inspect ~/.openclaw/openclaw.json and the workspace .env before use.
3. Consider running the installer in an isolated environment, container, or throwaway VM to avoid exposing credentials or disrupting host services.
4. Ensure you are comfortable with keys being written to disk in the project .env.
5. If you need the quick query feature but not the web UI, prefer running only the headless ask script after reviewing it manually.
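The backup step the guidance recommends can be sketched as a small shell function. This is not the skill's code; the function name is illustrative and the paths follow the report's description of what the installer reads.

```shell
# Hedged sketch: back up the files the installer is reported to read, so they
# can be compared or restored after an install attempt.
backup_configs() {
  # $1 = directory standing in for ~/.openclaw
  dir=$1
  for f in "$dir/openclaw.json" "$dir/workspace/.env"; do
    if [ -f "$f" ]; then
      cp "$f" "$f.bak"
      echo "backed up: $f"
    fi
  done
}
```

Invoked as `backup_configs ~/.openclaw`, this leaves `.bak` copies next to the originals so a post-install diff can reveal whether the installer modified them.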

Review Dimensions

Purpose & Capability
Concern: The skill's name and description match the included scripts: it installs, runs, and queries an LLM Council app. However, the registry metadata declares no required env vars or config paths, while the installer clearly expects and uses OPENROUTER_API_KEY (from the environment or workspace .env) and may read ~/.openclaw/openclaw.json for a gateway token and port. The missing declaration of required credentials and config paths is an inconsistency.
Instruction Scope
Concern: SKILL.md instructs the agent to invoke the bundled bash router, which in turn may clone a GitHub repo, write a .env file containing API credentials into the project directory, run 'uv sync' (Python dependency actions), run 'npm ci', and start backend and frontend services. The installer reads ~/.openclaw/workspace/.env and ~/.openclaw/openclaw.json (not declared in metadata) and will produce network traffic to GitHub and, via the app, to the chosen LLM endpoint. It also exposes a web UI (Vite) and prints local URLs. These actions go beyond simple local formatting or lookup: they touch local credential files and launch network-exposed services.
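The .env write described above can be illustrated with a minimal sketch. This is not the skill's actual code; the function name and argument layout are assumptions, and only the OPENROUTER_API_KEY variable is taken from the report.

```shell
# Hedged sketch of writing a credential into a cloned project's .env file.
write_project_env() {
  # $1 = cloned project directory, $2 = API key value
  umask 077  # at minimum, keep the secrets file readable only by the owner
  printf 'OPENROUTER_API_KEY=%s\n' "$2" > "$1/.env"
}
```

Even with restrictive permissions, a plaintext key on disk inside a third-party checkout is exactly the exposure the verdict asks users to weigh before installing.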
Install Mechanism
Note: There is no explicit registry 'install' spec, but the packaged scripts perform a 'git clone https://github.com/jeadland/llm-council' and run native tooling (uv sync, npm ci). Cloning from GitHub is a standard release host. 'uv sync' and 'npm ci' will pull and execute third-party packages; this is expected, but it means runtime code will be installed and executed on the host.
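One way to review what 'npm ci' would pull in before it runs is to enumerate the lockfile's entries. A sketch, assuming the npm lockfile v2/v3 layout where installed packages appear under "node_modules/..." keys; the function name is illustrative.

```shell
# Hedged sketch: list the packages a lockfile would install, for manual review
# before any third-party code executes.
list_locked_packages() {
  # $1 = path to package-lock.json
  grep -o '"node_modules/[^"]*"' "$1" |
    sed -e 's/^"node_modules\///' -e 's/"$//' |
    sort -u
}
```

This is a crude text scan, not a JSON parse, but it gives a quick inventory to check against the upstream repository's declared dependencies.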
Credentials
Concern: The skill requires an API credential (OPENROUTER_API_KEY) or an OpenClaw gateway token; the installer will read these from the environment, ~/.openclaw/workspace/.env, or ~/.openclaw/openclaw.json, then write them into a .env file in the cloned repo. The registry metadata claims no required credentials, so this access to local secrets and config is undeclared and therefore disproportionate. Reading the OpenClaw gateway token from openclaw.json is sensitive and should be explicitly declared.
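The lookup order the review describes (environment first, then the workspace .env, then openclaw.json) can be sketched as follows. The function name and the "gateway_token" JSON field are assumptions for illustration; the real script should be reviewed to confirm the exact fields it reads.

```shell
# Hedged sketch of the credential resolution order the report describes.
resolve_key() {
  # $1 = workspace .env path, $2 = openclaw.json path
  if [ -n "$OPENROUTER_API_KEY" ]; then
    echo "$OPENROUTER_API_KEY"
  elif [ -f "$1" ] && grep -q '^OPENROUTER_API_KEY=' "$1"; then
    sed -n 's/^OPENROUTER_API_KEY=//p' "$1"
  elif [ -f "$2" ]; then
    # crude text extraction; a real script should parse the JSON properly
    sed -n 's/.*"gateway_token"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p' "$2"
  fi
}
```

The security-relevant point is the last branch: falling back to the OpenClaw config silently widens the set of secrets the installer touches, which is why the review wants that access declared.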
Persistence & Privilege
Note: The skill does not request 'always: true' and is user-invocable only. It will, however, start background services (backend on :8001, frontend on :5173 or :4173), write a PID file in the skill directory, place logs in /tmp, and forcibly kill any processes listening on the configured ports (8001, 5173, 4173). The frontend is started with --host 0.0.0.0, exposing it to the local network. Killing arbitrary PIDs on ports and binding services to 0.0.0.0 are powerful actions that can impact other services on the host.
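A safer pattern than the blanket kill-by-port behavior flagged above is to stop only a process the skill itself recorded. A sketch, not taken from the skill's code; the function name and PID-file convention are illustrative.

```shell
# Hedged sketch: stop only the process recorded in the skill's own PID file,
# instead of killing whatever happens to hold a port.
stop_recorded() {
  # $1 = PID file written when the service was started
  [ -f "$1" ] || { echo "no pid file"; return 0; }
  pid=$(cat "$1")
  if kill -0 "$pid" 2>/dev/null; then
    kill "$pid"
    echo "stopped $pid"
  else
    echo "stale pid $pid"
  fi
  rm -f "$1"
}
```

This scopes the shutdown to the skill's own services, whereas killing by port number can take down an unrelated process that happens to be listening on 8001, 5173, or 4173.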