Skill v1.0.0

ClawScan security

Windows Local Embedding · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Benign
Mar 14, 2026, 11:02 AM
Verdict
benign
Confidence
high
Model
gpt-5-mini
Summary
The skill's instructions, downloads, and edits are coherent with its stated purpose of enabling local embeddings on Windows for OpenClaw; nothing requests unrelated credentials or hidden endpoints. Follow normal safety precautions: back up configs, verify downloaded files, and be aware that npm will install native binaries.
Guidance
This skill is coherent and focused on enabling local embeddings on Windows. Before proceeding:
(1) back up your C:\Users\...\.openclaw\openclaw.json;
(2) verify the HuggingFace download URL, confirm the file header reads 'GGUF', and check that the file size is ≈139 MB;
(3) be aware that 'npm install node-llama-cpp' downloads native binaries and may require admin rights; inspect the package or run it in a test environment if unsure;
(4) the provided verification script imports a module from your OpenClaw installation (call-C5sk0PsH.js); inspect that file if you have concerns about running code from the app directory;
(5) if you want stricter isolation, perform the model installation and testing on a non-critical machine or VM first.
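The header and size check in step (2) can be sketched in a few lines of Python. The file path and the exact expected size are assumptions; substitute wherever you saved the model and the size published on the HuggingFace release page.

```python
# Sketch: verify a downloaded GGUF model file before use.
# GGUF files begin with the 4-byte ASCII magic "GGUF".
from pathlib import Path


def check_gguf(path, expected_size_mb=139, tolerance_mb=5):
    """Return True if the file starts with the GGUF magic bytes
    and its size is within tolerance of the expected size in MB."""
    p = Path(path)
    with p.open("rb") as f:
        magic = f.read(4)
    size_mb = p.stat().st_size / (1024 * 1024)
    header_ok = magic == b"GGUF"
    size_ok = abs(size_mb - expected_size_mb) <= tolerance_mb
    return header_ok and size_ok
```

This catches the most common failure modes (an HTML error page saved in place of the model, or a truncated download) without needing any external tooling.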

Review Dimensions

Purpose & Capability
ok
The skill's name/description (Windows local embedding for OpenClaw) matches the actions it instructs: download a GGUF embedding model from HuggingFace, install node-llama-cpp, edit openclaw.json to set provider=local, restart OpenClaw, and run a local verification script. All required steps are reasonable for this stated goal.
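The config edit described above can be done safely with a small script that backs up the file first. The "embedding" key name is an assumption; the report only states that openclaw.json gains provider=local, so check the skill's SKILL.md for the authoritative key layout.

```python
# Sketch: set the embedding provider to "local" in openclaw.json,
# keeping a .bak copy of the original config.
# NOTE: the nesting under an "embedding" key is an assumption.
import json
import shutil
from pathlib import Path


def set_local_provider(config_path):
    cfg_file = Path(config_path)
    # Back up the config before modifying it.
    shutil.copy2(cfg_file, cfg_file.parent / (cfg_file.name + ".bak"))
    cfg = json.loads(cfg_file.read_text(encoding="utf-8"))
    cfg.setdefault("embedding", {})["provider"] = "local"  # assumed key name
    cfg_file.write_text(json.dumps(cfg, indent=2), encoding="utf-8")
    return cfg
```

Editing via json.loads/json.dumps rather than string replacement preserves the rest of the config and fails loudly if the file is not valid JSON.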
Instruction Scope
ok
SKILL.md stays on-topic: it verifies the model file header, guides where to place the model, how to install the native dependency, how to edit the OpenClaw config, and how to validate via a local gateway call. It does not instruct reading unrelated system files, transmitting data to third-party endpoints (beyond downloading the model from HuggingFace), or collecting secrets.
Install Mechanism
ok
No built-in install spec; user-run steps include downloading the model from a direct HuggingFace URL and running 'npm install node-llama-cpp' in the OpenClaw resources folder. Both are expected for this use case; the HuggingFace URL is an official release-hosted link and npm is the standard registry. Note: node-llama-cpp pulls precompiled native binaries.
Credentials
ok
The skill requests no environment variables, credentials, or external config paths. The only modifications are to the user's OpenClaw config file and to install a dependency inside the OpenClaw application folder — both proportional to enabling local embeddings.
Persistence & Privilege
note
The skill asks the user to modify OpenClaw's configuration and to install a native dependency into Program Files/resources. This requires file-write privileges (may need elevated permissions) and will change the local application state, which is expected for this task but worth noting before proceeding.