Skill v1.0.3

ClawScan security

MemVault · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Reviewed: Mar 2, 2026, 3:39 AM
Verdict: Review
Confidence: medium
Model: gpt-5-mini
Summary
The skill appears to implement the advertised memory server, but the installer auto-downloads and runs a third-party installer (curl | sh for Ollama), the package expects and creates many environment variables that are not declared in the registry, and running it may send stored memories to whatever LLM endpoint you configure. Review before installing.
Guidance
This package is functionally consistent with a self-hosted long-term memory server, but use caution before running the installer. Review scripts/install.sh and avoid blindly running curl | sh from the network; consider installing Ollama manually or pointing MEMVAULT_LLM_BASE_URL at a known local endpoint. Change the default DB password (postgres/postgres) and confirm where MEMVAULT_LLM_BASE_URL points: if it targets a cloud LLM (OpenAI/Groq), your stored memories will be transmitted to that provider. If you must test, run inside an isolated environment (VM/container) and inspect the docker-compose and Dockerfile builds so you can audit downloaded models and packages. For lower risk, skip the auto-installer and start the docker-compose build manually after reviewing the files.
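One way to act on the "review scripts/install.sh" advice is a quick pre-install audit for the risky pattern the scanner flagged. A minimal sketch (the install.sh written below is a stand-in fabricated for illustration; audit the real file shipped with the skill instead):

```shell
# Stand-in for the skill's scripts/install.sh, created only for this sketch.
cat > install.sh <<'EOF'
curl -fsSL https://ollama.com/install.sh | sh
EOF

# Flag any "curl ... | sh" (or bash) pipeline before agreeing to run the installer.
if grep -Eq 'curl[^|]*\|[[:space:]]*(ba)?sh' install.sh; then
  echo "WARNING: installer pipes a remote script to sh; review it first"
fi
```

A hit here does not prove malice (the pattern is common for Ollama installs), but it tells you exactly which line to read before consenting.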

Review Dimensions

Purpose & Capability
note · The code, Dockerfile, docker-compose, and CLI match the stated purpose (a long-term memory server with embeddings, decay, and retrieval). However, the registry metadata says 'required env vars: none' while the code and docker-compose rely on many environment variables (DB DSN, LLM base URL, API key, embedding URL, etc.). That mismatch is unexpected but plausibly an omission rather than outright malice.
Instruction Scope
note · SKILL.md instructs you to run the included install script and then call local endpoints and cron jobs. The runtime instructions themselves are scoped to installing and operating the service (memorize, retrieve, decay). They do not instruct arbitrary file-system reads. Caveat: troubleshooting text references an OpenClaw workspace path which may not exist in all installs (minor inconsistency).
Install Mechanism
concern · The provided scripts/install.sh will attempt to auto-install Ollama on Linux by executing a remote script via curl -fsSL https://ollama.com/install.sh | sh, a high-risk pattern (running a remote installer without review). The installer also starts background processes (ollama serve) and runs docker compose up --build, which will download images and pip packages and pre-download embedding models during the Docker build. These are expected for this project but are higher risk than an instruction-only skill; review the installer and the remote install script before running.
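The same installer step can be done in a reviewable way: fetch the remote script to disk, inspect it, then execute the saved copy. A sketch with the network fetch commented out and a stand-in file in its place so it runs offline:

```shell
# Step 1 (network; shown commented so this sketch runs offline):
#   curl -fsSL https://ollama.com/install.sh -o ollama-install.sh
# Stand-in script so the rest of the sketch is runnable:
printf 'echo "installer ran"\n' > ollama-install.sh

# Step 2: inspect the saved script (less ollama-install.sh) and record its hash.
sha256sum ollama-install.sh

# Step 3: only after reading it, execute the local copy explicitly.
sh ollama-install.sh
```

The hash also lets you confirm that the copy you reviewed is the copy you ran, which curl | sh never allows.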
Credentials
concern · Although the registry metadata declares no required environment variables, the code and docker-compose rely on many env vars (MEMVAULT_DB_DSN, MEMVAULT_LLM_BASE_URL, MEMVAULT_LLM_API_KEY, MEMVAULT_EMBEDDING_URL, etc.). Defaults include cleartext DB credentials (postgres/postgres) in the compose file, and the installer creates a .env. If you point MEMVAULT_LLM_BASE_URL to a public/cloud LLM (OpenAI, etc.), memories and potentially sensitive content will be sent to that provider. The skill may therefore handle secrets/PII; explicitly set appropriate credentials and endpoints, and avoid public LLMs if you want to keep data local.
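To keep everything local, the undeclared variables can be pinned explicitly in the .env the installer would otherwise generate. A hypothetical example (variable names come from this review; the values, ports, and DSN format are placeholders, not confirmed defaults of the package):

```shell
# Hypothetical .env: local Ollama endpoints, non-default DB password.
cat > .env <<'EOF'
MEMVAULT_DB_DSN=postgresql://memvault:change-me@localhost:5432/memvault
MEMVAULT_LLM_BASE_URL=http://localhost:11434/v1
MEMVAULT_LLM_API_KEY=unused-for-local-ollama
MEMVAULT_EMBEDDING_URL=http://localhost:11434/api/embeddings
EOF

# Guard: warn if the configured LLM endpoint would ship memories off-box.
. ./.env
case "$MEMVAULT_LLM_BASE_URL" in
  http://localhost*|http://127.0.0.1*) echo "LLM endpoint is local" ;;
  *) echo "WARNING: memories will be sent to $MEMVAULT_LLM_BASE_URL" ;;
esac
```

The guard is trivial but catches the failure mode the review warns about: silently pointing MEMVAULT_LLM_BASE_URL at a cloud provider.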
Persistence & Privilege
ok · The skill does not request permanent platform presence (always: false). The installer creates a CLI symlink in ~/.local/bin, writes a .env in the skill directory, and uses a Docker volume for DB persistence; these are normal for a self-hosted service and do not modify other skills or system-wide agent settings.