suspicious.install_untrusted_source
- Location: config.example.yaml:2
- Finding: Install source points to a URL shortener or a raw IP address.
Advisory. Audited by static analysis on May 10, 2026.
Detected: suspicious.install_untrusted_source
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
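A check like this finding can be sketched with a small URL classifier. This is an illustrative sketch, not ClawScan's actual detector: the shortener denylist below is a hypothetical sample, and real scanners use larger curated lists.

```python
import ipaddress
from urllib.parse import urlparse

# Hypothetical sample denylist for illustration only.
SHORTENER_HOSTS = {"bit.ly", "tinyurl.com", "t.co", "goo.gl"}

def is_untrusted_install_source(url: str) -> bool:
    """Flag install URLs that point at a shortener or a raw IP literal."""
    host = urlparse(url).hostname or ""
    if host in SHORTENER_HOSTS:
        return True
    try:
        # ip_address() only parses raw IPv4/IPv6 literals; hostnames raise.
        ipaddress.ip_address(host)
        return True
    except ValueError:
        return False
```

Either signal (shortener or raw IP) hides the real origin of the artifact, which is why both trip the same finding.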
If the agent or a user supplies the wrong path, this helper can overwrite or append to arbitrary writable files, not just memory notes.
The helper writes to an absolute target or a workspace-relative target without checking that the final path is under the intended memory directory or is a Markdown memory file.
if [[ "$TARGET" != /* ]]; then ABS_TARGET="$WORKSPACE/$TARGET"; fi
...
cat "$CONTENT_FILE" > "$ABS_TARGET"
Constrain the target to a canonical path under workspace/memory, reject absolute paths and '..' traversal, require a .md suffix, and ask for confirmation before overwrite mode.
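The recommended constraints could be enforced by a validator along these lines. This is a minimal sketch assuming a `workspace/memory` root; the function name and root path are hypothetical, and the overwrite-confirmation step is left to the caller.

```python
from pathlib import Path

MEMORY_ROOT = Path("workspace/memory").resolve()  # assumed memory root

def safe_memory_path(target: str) -> Path:
    """Return a canonical path under MEMORY_ROOT, or raise ValueError."""
    if Path(target).is_absolute():
        raise ValueError("absolute target paths are rejected")
    candidate = (MEMORY_ROOT / target).resolve()
    # resolve() collapses any '..' segments, so containment is checked
    # against the final canonical path, not the raw input string.
    if not candidate.is_relative_to(MEMORY_ROOT):
        raise ValueError("target escapes the memory directory")
    if candidate.suffix != ".md":
        raise ValueError("memory files must use a .md suffix")
    return candidate
```

Checking containment after `resolve()` is what defeats `..` traversal; string prefix checks on the raw input do not.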
Incorrect, sensitive, or malicious content placed in memory can influence later answers and may be retained in local index files.
The skill intentionally persists and reuses Markdown memory plus a semantic index across sessions.
Treat the Markdown files as the source of truth... DB path: `memory/.semantic-index.db`... Daily warmup: `session-start.sh` runs `memory-semantic-search.py --build-only` once per day
Keep secrets out of memory, review hard-rules/current-state files regularly, and delete or rebuild the semantic index when removing sensitive memory.
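The delete-and-rebuild step above can be automated. A sketch assuming the DB path and `--build-only` flag quoted from the skill's own docs; the helper name and the injectable `run` parameter are choices made here for testability, not part of the skill.

```python
import subprocess
from pathlib import Path

INDEX_DB = Path("memory/.semantic-index.db")  # DB path from the skill docs

def rebuild_semantic_index(run=subprocess.run) -> None:
    """Drop the semantic index and rebuild it from the Markdown files."""
    # Remove the stale index so deleted memories do not linger as embeddings.
    INDEX_DB.unlink(missing_ok=True)
    # Reuse the skill's own build entry point (per its session-start warmup).
    run(["python", "memory-semantic-search.py", "--build-only"], check=True)
```

Rebuilding rather than editing the index avoids leaving embeddings of removed sensitive text behind.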
With the default local Ollama setup this stays on the machine, but changing the base URL could send memory contents to another service.
Memory/query text is sent to the configured Ollama-compatible embedding endpoint. The default endpoint is localhost, but the URL is configurable.
payload = json.dumps({"model": model, "prompt": text}).encode("utf-8")
...
f"{base_url.rstrip('/')}/api/embeddings"
Keep the Ollama base URL pointed at a trusted local endpoint unless you intentionally want remote processing, and do not store credentials or secrets in memory files.
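One way to enforce the local-only recommendation is to gate the endpoint URL before any memory text is sent. A sketch assuming a local-only policy; the trusted-host set and function name are choices made here, not part of the skill.

```python
from urllib.parse import urlparse

# Assumption: only loopback endpoints are acceptable.
TRUSTED_HOSTS = {"localhost", "127.0.0.1", "::1"}

def embeddings_endpoint(base_url: str) -> str:
    """Build the /api/embeddings URL, refusing non-local hosts."""
    host = urlparse(base_url).hostname
    if host not in TRUSTED_HOSTS:
        raise RuntimeError(f"refusing non-local embedding endpoint: {host!r}")
    return f"{base_url.rstrip('/')}/api/embeddings"
```

Failing loudly here means a misconfigured base URL cannot silently ship memory contents to a remote service.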
The skill depends on locally installed packages and models whose exact resolved artifacts are not pinned in the provided install metadata.
The setup flow downloads Python dependencies and local model artifacts outside the registry install spec.
uv sync
ollama pull qwen3-embedding:0.6b
ollama pull gemma4:e4b
Install from a trusted source, consider using a lockfile or pinned package versions, and verify Ollama model sources before production use.
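When artifacts cannot be pinned in the install spec, their digests can at least be verified after download. A sketch: the filename and the pinned-digest table are hypothetical placeholders to be filled in when the artifacts are vetted.

```python
import hashlib

# Hypothetical pinned digests recorded at vetting time; not real values.
PINNED = {"qwen3-embedding-0.6b.gguf": "<sha256 recorded when vetted>"}

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_artifact(path: str, name: str) -> None:
    """Raise if the file's digest does not match the pinned value."""
    digest = sha256_of(path)
    if digest != PINNED.get(name):
        raise RuntimeError(f"{name}: digest {digest} does not match pin")
```

This complements, rather than replaces, a proper lockfile: `uv` can pin the Python side, while model pulls still need out-of-band verification.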