Skill v1.3.0
ClawScan security
Self-Repair System — Autonomous AI Automation · ClawHub's context-aware review of the artifact, metadata, and declared behavior.
Scanner verdict
Benign · Mar 12, 2026, 6:40 AM
- Verdict
- benign
- Confidence
- high
- Model
- gpt-5-mini
- Summary
- The skill's code, instructions, and requirements are consistent with an on‑host self‑repair/autonomy tool for Ollama and workspace files. No unexplained external credentials or install steps are requested, but review the configuration (backup paths and the Ollama host) before use.
- Guidance
- This skill appears to do what it says: local health checks, repair from backups, and restarting a local Ollama instance. Before installing or enabling it:
  1. Review and set backupPaths to directories you trust (the skill will copy files from backups into your workspace).
  2. Ensure ollamaHost/ollamaPort remain set to localhost (the code defaults to localhost but can be configured to any host; pointing it at an external host would allow network traffic outside your machine).
  3. Be aware it may start a detached local process if Ollama is missing; confirm the ollama binary paths it searches are appropriate for your system.
  4. Run first in a controlled/sandbox environment and inspect repair-log.json to verify behavior.
  If you need stronger guarantees about network isolation, run the skill in an environment that enforces localhost-only network rules.
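The configuration review in the guidance above can be sketched as a minimal pre-flight check. The property names (backupPaths, ollamaHost, ollamaPort) mirror those named in this review, but the exact configuration shape is an assumption, not the skill's actual constructor signature:

```javascript
// Hypothetical configuration sketch — field names follow this review;
// the real skill may structure its options differently.
const config = {
  backupPaths: ["/home/user/backups/workspace"], // trusted local dirs only
  ollamaHost: "localhost", // a remote host would send prompts off-machine
  ollamaPort: 11434,
};

// Guard: refuse to proceed if the configured host is not loopback.
function isLoopback(host) {
  return host === "localhost" || host === "127.0.0.1" || host === "::1";
}

if (!isLoopback(config.ollamaHost)) {
  throw new Error(`Refusing non-local Ollama host: ${config.ollamaHost}`);
}
```

A check like this can run before instantiating the skill, so a stray configuration edit cannot silently redirect prompts to an external endpoint.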
Review Dimensions
- Purpose & Capability
- ok · Name and description (auto‑healing for Ollama, configs, filesystem) align with the included code: workspace integrity checks, config validation/repair, starting Ollama, scheduled routines, and logging. There are no unrelated credentials, binaries, or obscure install steps requested.
- Instruction Scope
- note · Runtime instructions and source are focused on local checks and repairs (file reads/writes, local HTTP checks to Ollama, spawning a local Ollama executable). One important caveat: hub.js and the SelfRepair constructor accept an ollamaHost/ollamaPort value — while defaults point to localhost, a user-supplied configuration could cause the code to contact a non‑local HTTP endpoint and use that remote service for prompts. The code otherwise avoids reading environment variables and limits activity to workspace paths and backupPaths provided at instantiation.
- Install Mechanism
- ok · No install spec or external downloads are present; this is an instruction/code bundle. Nothing will be fetched or extracted during install by the skill itself.
- Credentials
- note · The skill requests no environment variables or credentials. It does read/write files in the workspace, uses the OS homedir to probe for Ollama binaries, and will copy from backupPaths provided by the user. Because backupPaths and ollamaHost are configuration inputs, ensure those point only to trusted local locations and that you don't inadvertently point the skill at external hosts or untrusted backups.
- Persistence & Privilege
- ok · always:false and no system-wide config changes are requested. The skill logs to a local repair-log.json and may spawn a detached Ollama process (using spawn with fixed args). Running detached processes and writing logs is expected for this purpose; it does not modify other skills or global agent settings.
