File-system + vector-powered memory skill for OpenClaw — semantic recall, daily journaling, and safeguard flushing, all running locally via Ollama
Advisory. Audited by static analysis on Apr 30, 2026.
Overview

Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes. The static scan detected no suspicious patterns.

Findings (4)
If the memory directory contains symlinks, the skill may read or overwrite files outside the intended memory folder when a symlinked path is used.
The guard normalizes the path string but does not use realpath/lstat to resolve symlinks before writing. A symlink inside memoryDir could therefore point writes outside the intended memory folder, despite the documented boundary.
```js
resolvedPath = path.resolve(resolvedPath);
...
if (!resolvedPath.startsWith(resolvedMemDir + path.sep) && resolvedPath !== resolvedMemDir) { ... }
...
await fs.writeFile(fullPath, newContent, 'utf-8');
```

Recommendation: Resolve and validate real paths with fs.realpath/lstat before reads and writes, reject symlinked paths or use no-follow semantics where possible, and keep memoryDir as a private directory without symlinks.
Memories from other groups or contexts could appear in search results if the agent searches without a group filter or if the group directory layout does not match the code.
When no group is supplied, search scans the entire memoryDir recursively rather than enforcing the current session's group boundary. This matters because the skill advertises group-isolated memories.
```js
let searchDir = resolvedBaseDir;
if (group) { ... searchDir = path.join(resolvedBaseDir, 'groups', safeGroup); }
...
await scan(searchDir);
```

Recommendation: Bind memory tools to the current session/group by default, align the documented and implemented group paths, and require explicit user approval for cross-group or global searches.
Old memory entries may influence later agent behavior or expose past conversation details in new sessions.
The skill automatically loads persisted Markdown memory into future sessions. This is purpose-aligned, but stored content can be sensitive, stale, or instruction-like.
```js
filesToLoad.push('MEMORY.md');
...
filesToLoad.push(`memory/${today}.md`);
...
return { loaded, content: combinedContent };
```

Recommendation: Review and edit the memory files periodically, keep memoryDir private, and use manual or disabled flush modes if automatic persistence is not desired.
Memory content may be processed by the local Ollama service during semantic search.
When vector search is enabled, queries and memory chunks are sent to the local Ollama service for embeddings. This is disclosed and local, but it is still a data flow users should understand.
```js
const OLLAMA_BASE_URL = 'http://localhost:11434';
...
body: JSON.stringify({ model, prompt: cleanText })
```

Recommendation: Enable vector search only if you trust the local Ollama setup, and keep Ollama bound to localhost rather than exposing it on a network interface.
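The shape of the data flow can be sketched as a request builder. The control-character stripping and the `nomic-embed-text` model name used in the test are assumptions; the `POST /api/embeddings` body with `model` and `prompt` fields matches both the quoted snippet and Ollama's embeddings endpoint.

```javascript
// Hypothetical sketch of the embedding request the skill sends to the
// local Ollama service when vector search is enabled.
const OLLAMA_BASE_URL = 'http://localhost:11434';

function buildEmbeddingRequest(model, text) {
  // Illustrative cleanup: strip control characters from the memory chunk
  // before it leaves the process for embedding.
  const cleanText = text.replace(/[\u0000-\u001f]/g, ' ').trim();
  return {
    url: `${OLLAMA_BASE_URL}/api/embeddings`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model, prompt: cleanText }),
    },
  };
}
```

Because the base URL is hard-coded to localhost, the exposure boundary is whatever the local Ollama daemon is bound to; binding it to `127.0.0.1` rather than `0.0.0.0` keeps memory content off the network.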
