Skill flagged: suspicious patterns detected
ClawHub Security flagged this skill as suspicious; review the scan results below before using it.
dream-memory
v1.0.0 · A complete workspace memory-management system. Four-layer architecture: file storage + OpenViking vector engine + Ollama bge-m3 + Agent rules. Use when: the user asks how the memory system works, how to install the memory system into a new Agent, about the memory file structure, vector-retrieval principles, the Session Flush mechanism, long-term memory promotion rules, OpenVik...
Security Scan
OpenClaw verdict: Benign · medium confidence
Purpose & Capability
The skill describes a memory stack (files + vector DB + local LLM + agent rules) and its instructions and scripts only touch local workspace files, an OpenViking vector service, and a local Ollama model—these map to the stated purpose.
Instruction Scope
SKILL.md and the self-check script explicitly read workspace files (MEMORY.md, AGENTS.md, etc.) and OpenClaw session/config files (~/.openclaw/...), which are relevant for session flush/check behavior. No instructions appear to collect or transmit data to external endpoints beyond local services.
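You can check that scope claim yourself before enabling the skill by listing every home-directory path its files reference. A minimal sketch, assuming the skill source is unpacked under ./dream-memory (the directory name is illustrative):

```shell
# List every home-directory path referenced anywhere in the skill's files,
# so the output can be compared against the paths the scan report mentions.
# Assumes the skill is unpacked under ./dream-memory (adjust as needed).
grep -rhoE '~/[.A-Za-z0-9_/-]+' dream-memory/ | sort -u
```

Any path in that output outside the workspace and ~/.openclaw/... would contradict the instruction-scope finding above.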
Install Mechanism
There is no formal install spec, but references/ollama-setup.md recommends running `curl -fsSL https://ollama.com/install.sh | sh` and using `ollama pull bge-m3`. Download-and-execute via curl|sh is common for installers but is higher-risk than package installs; the URL is the official domain (ollama.com), which mitigates but does not eliminate risk.
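The review-then-run alternative to `curl | sh` can be sketched as below. The printf line is a stand-in for the real download so the pattern stays self-contained; for the actual installer you would fetch https://ollama.com/install.sh into the temp file with curl instead:

```shell
# Download-then-review pattern instead of piping curl straight into sh.
tmp="$(mktemp)"
# Stand-in for: curl -fsSL https://ollama.com/install.sh -o "$tmp"
printf '#!/bin/sh\necho "installer ran"\n' > "$tmp"
sh -n "$tmp" && echo "syntax ok"   # parse-check the script without executing it
# ...open "$tmp" in an editor and actually read it here...
sh "$tmp"                          # run only after you have reviewed it
rm -f "$tmp"
```

The `sh -n` step catches truncated or corrupted downloads before anything executes; the manual read is what catches malicious content.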
Credentials
The skill declares no required env vars or secrets. The script optionally reads OPENCLAW_CONFIG or defaults to ~/.openclaw/openclaw.json—this is proportional to verifying memorySearch settings and session tracking.
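The lookup described above amounts to a one-line fallback. A sketch of the assumed resolution order (OPENCLAW_CONFIG wins if set, otherwise the default path):

```shell
# Resolve the config file the self-check reads: the env var takes priority,
# otherwise fall back to the default location under the home directory.
config="${OPENCLAW_CONFIG:-$HOME/.openclaw/openclaw.json}"
echo "using config: $config"
```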
Persistence & Privilege
The skill is not always-enabled and requests no special platform privileges. It can be invoked autonomously (the normal default); combined with local-file access this is expected for a memory-management skill, but users should be aware it may read agent session and config files when run.
Assessment
This skill appears coherent with its stated purpose (managing workspace memories using a local vector DB and a local LLM). Before installing or running:
1. Review the Ollama install script at https://ollama.com/install.sh yourself instead of piping it blindly to `sh`.
2. Be aware the skill and its self-check read ~/.openclaw/openclaw.json and agent session files; inspect those files for sensitive information you do not want read.
3. Running `ollama pull bge-m3` may download large model data and start a local service.
4. If you enable autonomous invocation, expect the skill to read and update local memory files and sessions as described.
If you want higher assurance, request a signed/reviewed installer, or run the steps manually under an isolated account or container.
Like a lobster shell, security has layers: review code before you run it.
latest · vk97dz3g2f4z2ecwkqhq87cjqpn847y8f
License
MIT-0
Free to use, modify, and redistribute. No attribution required.
