Clawhub Skill
Analysis
The skill appears to be a real local memory engine, but it grants persistent memory authority and includes under-disclosed command execution, local file handling, and optional cloud/credential paths that deserve careful review.
Findings (8)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Checks for instructions or behavior that redirect the agent, misuse tools, execute unexpected code, cascade across systems, exploit user trust, or continue outside the intended task.
```js
const proc = spawn(aiConfig.cli_command, [...aiConfig.cli_args, prompt], {
```
The code spawns whatever CLI command the configuration names and passes the prompt to it as an argument. The main docs describe optional OpenAI-compatible API configuration, not this broader local command backend.
```js
.map(p => p.startsWith('~/') ? p.replace('~', process.env.HOME || '') : p)
```
The static scan also reports `readFileSync` on this context-prep path. Expanding `~/` prefixes in collected paths indicates local file access that is not clearly scoped in the user-facing documentation.
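A minimal reconstruction of that expansion step (the function name is mine, not the skill's):

```javascript
// Expand a leading '~/' to the user's home directory, as the scanned
// snippet does before paths are handed to readFileSync.
function expandHome(p) {
  return p.startsWith('~/') ? p.replace('~', process.env.HOME || '') : p;
}
```

Any path collected from context, once expanded this way, can reference files anywhere under the user's home directory.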
> 100% local memory. Zero cloud upload. Zero telemetry. ... LLM Rerank — NAM recalls 4x candidates, LLM picks the most relevant ones
The documentation makes strong zero-cloud claims while also describing optional cloud-compatible LLM providers that may process queries or recalled candidates.
> API starts at localhost:18800 ... every minute: memory decay ... every 6h: L1→L2 topic clustering ... every 12h: L2→L3 mental model
The skill starts a persistent local service with recurring background work that continues beyond a single user request.
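The documented cadence amounts to a fixed schedule table. The job names come from the docs; the scheduling code itself is my assumption, not the skill's:

```javascript
// Recurring jobs as documented: decay every minute, topic clustering
// every 6 hours, mental-model consolidation every 12 hours.
const JOBS = [
  { name: 'memory decay', periodMinutes: 1 },
  { name: 'L1→L2 topic clustering', periodMinutes: 6 * 60 },
  { name: 'L2→L3 mental model', periodMinutes: 12 * 60 },
];

// Which jobs fire at a given minute since service start? Illustrates
// that work recurs indefinitely, independent of any user request.
function dueJobs(minute) {
  return JOBS.filter(j => minute > 0 && minute % j.periodMinutes === 0)
             .map(j => j.name);
}
```

At minute 720, all three jobs fire at once; at every other minute, at least the decay job runs.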
```sh
npm install @cc-soul/openclaw  # API auto-starts at localhost:18800
```
The skill asks users to install an npm package and auto-start a service, while the registry section says there is no install spec for this skill.
Checks whether tool use, credentials, dependencies, identity, account access, or inter-agent boundaries are broader than the stated purpose.
```json
"api_key": "your-key-here"
```
The optional LLM configuration uses a provider API key stored in `~/.cc-soul/data/ai_config.json`; this is expected for the integration but is not declared as a registry credential.
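Because the key sits in a plaintext JSON file, any code path that reads the config and then logs or reports it should mask the credential first. A hedged sketch of such a helper (the function name and field handling are mine):

```javascript
// Mask the api_key field before a config object is logged or reported,
// so the credential stored in ~/.cc-soul/data/ai_config.json
// never leaves the file in diagnostics output.
function redactConfig(config) {
  const copy = { ...config };
  if (typeof copy.api_key === 'string' && copy.api_key.length > 0) {
    copy.api_key = '***redacted***';
  }
  return copy;
}
```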
Checks for exposed credentials, poisoned memory or context, unclear communication boundaries, or sensitive data that could leave the user's control.
> Base URL: `http://localhost:18800` ... `POST /memories` ... `POST /search`
The skill exposes memory storage and search over a local HTTP API, and the examples show no authentication, origin checks, or caller boundary.
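For contrast, a minimal bearer-token check for such a local API might look like the sketch below; nothing in the skill's documented examples suggests anything like it exists (the helper name and header scheme are assumptions, not the skill's API):

```javascript
// Reject callers that do not present a pre-shared token. The documented
// /memories and /search endpoints appear to accept any local caller
// with no equivalent of this check.
function isAuthorized(headers, expectedToken) {
  const auth = headers['authorization'] || '';
  return auth === `Bearer ${expectedToken}`;
}
```

Without such a boundary, any local process or browser page able to reach `localhost:18800` can read and write the memory store.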
> cc-soul builds a word association network from your conversations... Every message updates word co-occurrence statistics
The system persistently learns from conversations and reuses that data for recall, persona selection, emotion tracking, and future context.
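The described learning loop amounts to incremental co-occurrence counting, roughly as follows (a toy sketch of the technique, not the skill's implementation):

```javascript
// Update pairwise word co-occurrence counts from one message: the core
// of the word-association network the docs describe. Keys are sorted
// word pairs so that order within a message does not matter.
function updateCooccurrence(stats, message) {
  const words = message.toLowerCase().split(/\W+/).filter(Boolean);
  for (let i = 0; i < words.length; i++) {
    for (let j = i + 1; j < words.length; j++) {
      const key = [words[i], words[j]].sort().join('|');
      stats[key] = (stats[key] || 0) + 1;
    }
  }
  return stats;
}
```

Run over every message, this accumulates a persistent statistical profile of the user's conversations, which is what makes the memory authority long-lived rather than per-request.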
