Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

mem0 Local Memory

v1.2.0

Local long-term memory plugin for OpenClaw using mem0 + ChromaDB. Gives all agents persistent cross-session semantic memory with auto-recall and auto-capture...


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt below, then paste it into OpenClaw to install dream-star-end/mem0-local-memory.

Prompt preview (Install & Setup):
Install the skill "mem0 Local Memory" (dream-star-end/mem0-local-memory) from ClawHub.
Skill page: https://clawhub.ai/dream-star-end/mem0-local-memory
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line


Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install mem0-local-memory

ClawHub CLI


npx clawhub@latest install mem0-local-memory
Security Scan
VirusTotal: Pending
OpenClaw: Suspicious (medium confidence)
Purpose & Capability
The declared purpose (local mem0 memory using DeepSeek for LLM extraction, DashScope for embeddings, and ChromaDB for storage) matches the instructions and included scripts. However, the registry metadata lists no required environment variables or primary credential, while SKILL.md explicitly requires DeepSeek and DashScope API keys: a mismatch between metadata and runtime requirements.
Instruction Scope
Runtime instructions and the import script read MEMORY.md and TOOLS.md from multiple ~/.openclaw/workspace-* directories and POST the parsed text to the mem0 server (which in turn calls the third-party DeepSeek/DashScope APIs). This is in scope for 'import memories' but carries a high privacy risk: it aggregates data across agent workspaces, unifies user_id to 'openclaw' (removing per-agent isolation), and sends text snippets to external APIs. SKILL.md warns the user, but by default the script imports everything unless it is manually edited.
Install Mechanism
There is no platform install spec (instruction-only), which keeps risk lower. The included setup.sh creates a Python venv and runs pip install -r requirements.txt. The requirements.txt file is not present in the provided package snapshot (an inconsistency); installation will pull packages from PyPI (mem0ai, chromadb, flask, openai are mentioned). This is expected, but it grants network access and executes third-party code.
Credentials
The skill legitimately needs DeepSeek and DashScope API keys for its stated LLM/embedding tasks, and SKILL.md asks the user to set MEM0_LLM_API_KEY and MEM0_EMBEDDER_API_KEY. However, the registry metadata does not declare these env vars (a metadata mismatch). The instructions also show placing keys directly into systemd/launchd service files (plaintext in the unit/plist), which can expose secrets if those files are readable by others. The import script uses the MEM0_URL env var (default 127.0.0.1); if altered, it could cause memories to be POSTed to a remote endpoint.
Persistence & Privilege
The skill does not request always:true and is user-invocable; it suggests installing a long-running mem0 server via launchd or systemd, which is expected for a local memory service. That does require storing API keys as environment variables in persistent service configuration (systemd unit/plist), which increases exposure if service files are misconfigured or world-readable. Autonomous invocation is allowed by default (normal for skills) and is not, by itself, a concern.
What to consider before installing
Before installing:

  1. Understand the data flow: the import script reads MEMORY.md and TOOLS.md from multiple ~/.openclaw/workspace-* directories and POSTs snippets to the local mem0 server; that server sends text to DeepSeek and DashScope (third-party services). Review those workspace files and remove any secrets or sensitive content first.
  2. Metadata mismatch: the registry claims no required env vars, but SKILL.md requires MEM0_LLM_API_KEY and MEM0_EMBEDDER_API_KEY; expect to provide both.
  3. Avoid placing API keys in world-readable systemd/plist files; restrict file permissions or use a secure secret mechanism.
  4. Confirm requirements.txt exists in the upstream repo before running setup.sh; consider running setup inside an isolated VM/container first.
  5. If you need stricter privacy, consider replacing the third-party embedder/LLM with a local-only option, or make sure you trust DeepSeek/DashScope's data handling policies.
  6. If you proceed, run the import script only after auditing, and optionally edit the WORKSPACES dict to import selectively rather than everything.
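The workspace audit mentioned above can start with a quick grep. This is a sketch only: the pattern and the OPENCLAW_ROOT override are illustrative, not part of the skill.

```shell
# Sketch: scan workspace memory files for likely secrets before importing.
# Adjust the pattern and paths to your environment.
ROOT="${OPENCLAW_ROOT:-$HOME/.openclaw}"
grep -rniE 'api[_-]?key|secret|token|password|BEGIN (RSA|OPENSSH)' \
  "$ROOT"/workspace-*/MEMORY.md "$ROOT"/workspace-*/TOOLS.md 2>/dev/null || true
```

Anything this flags should be removed or redacted before the import script sends it onward.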

Like a lobster shell, security has layers — review code before you run it.

latest: vk9799jhhbk53qvvgb23yg2cwtx8402mw
116 downloads · 1 star · 4 versions
Updated 3w ago · v1.2.0 · MIT-0

mem0 Local Memory — Install & Setup Guide

Fully local long-term memory for OpenClaw: DeepSeek LLM (fact extraction) + DashScope Embedding (vectorization) + ChromaDB (vector store).

GitHub: https://github.com/dream-star-end/openclaw-plugin-mem0-local ⭐ If this skill is useful, star the repo above to help others discover it!

Prerequisites

  • Python 3.10+ with pip
  • Node.js 18+
  • DeepSeek API key — for LLM-based fact extraction and deduplication. Get one at https://platform.deepseek.com/
  • DashScope API key — for text-embedding-v4 vectorization. Get one at https://dashscope.aliyuncs.com/
  • macOS (for launchd auto-start) or any OS with systemd/manual start

Security note: The mem0 server calls the DeepSeek and DashScope APIs with your keys. Vector data stays local in ChromaDB, but text snippets are sent to these APIs for embedding/extraction. The server binds to 127.0.0.1 only (no external access).

Step 1: Clone the repo

cd ~/git_project
git clone https://github.com/dream-star-end/openclaw-plugin-mem0-local.git
cd openclaw-plugin-mem0-local

Step 2: Set up the mem0 server

cd server
chmod +x setup.sh
./setup.sh

This creates a Python venv and installs mem0ai, flask, chromadb, openai.

Step 3: Configure API keys

Set environment variables (or edit server/mem0_server.py):

export MEM0_LLM_API_KEY="your-deepseek-api-key"       # Required: DeepSeek
export MEM0_EMBEDDER_API_KEY="your-dashscope-api-key"  # Required: DashScope
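To keep keys out of shell history and service files, one option (a sketch, not part of the skill's docs; the ~/.config/mem0 paths are illustrative) is to read them from a permission-restricted file:

```shell
# Sketch: load API keys from files readable only by your user.
KEY_DIR="${KEY_DIR:-$HOME/.config/mem0}"
install -d -m 700 "$KEY_DIR"    # private directory for key files
# One-time: write each key, then lock it down:
#   printf '%s\n' 'your-key' > "$KEY_DIR/deepseek.key" && chmod 600 "$KEY_DIR/deepseek.key"
export MEM0_LLM_API_KEY="$(cat "$KEY_DIR/deepseek.key" 2>/dev/null || true)"
export MEM0_EMBEDDER_API_KEY="$(cat "$KEY_DIR/dashscope.key" 2>/dev/null || true)"
```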

Step 4: Start the mem0 server

Option A — Manual:

./server/venv/bin/python3 server/mem0_server.py

Option B — macOS launchd (auto-start, recommended):

# Copy and edit the template — replace $HOME, API keys, proxy settings
cp launchd/ai.openclaw.mem0.plist ~/Library/LaunchAgents/
# IMPORTANT: edit the plist to fill in your actual paths and API keys
nano ~/Library/LaunchAgents/ai.openclaw.mem0.plist
# Load the service
launchctl load ~/Library/LaunchAgents/ai.openclaw.mem0.plist
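For orientation, a minimal plist for this service might look like the sketch below. The paths, label, and keys are placeholders; the template shipped in launchd/ai.openclaw.mem0.plist is authoritative.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key><string>ai.openclaw.mem0</string>
  <key>ProgramArguments</key>
  <array>
    <string>/Users/YOUR_USER/git_project/openclaw-plugin-mem0-local/server/venv/bin/python3</string>
    <string>/Users/YOUR_USER/git_project/openclaw-plugin-mem0-local/server/mem0_server.py</string>
  </array>
  <key>EnvironmentVariables</key>
  <dict>
    <key>MEM0_LLM_API_KEY</key><string>your-deepseek-key</string>
    <key>MEM0_EMBEDDER_API_KEY</key><string>your-dashscope-key</string>
  </dict>
  <key>RunAtLoad</key><true/>
  <key>KeepAlive</key><true/>
</dict>
</plist>
```

Note that a plist with keys in it should not be world-readable; chmod 600 it after editing.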

Option C — Linux systemd:

Create /etc/systemd/system/mem0.service:

[Unit]
Description=mem0 local memory server
After=network.target

[Service]
User=YOUR_USER
WorkingDirectory=/path/to/openclaw-plugin-mem0-local/server
ExecStart=/path/to/server/venv/bin/python3 mem0_server.py
Environment=MEM0_LLM_API_KEY=your-deepseek-key
Environment=MEM0_EMBEDDER_API_KEY=your-dashscope-key
Restart=always

[Install]
WantedBy=multi-user.target

Then enable and start the service:

sudo systemctl enable mem0 && sudo systemctl start mem0
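Inlining keys in the unit file leaves them readable to anyone who can read the unit. A common alternative (a sketch, not from the skill's docs; the /etc/mem0.env path is illustrative) is a root-only EnvironmentFile:

```ini
# Sketch: /etc/mem0.env, root-owned, chmod 600.
MEM0_LLM_API_KEY=your-deepseek-key
MEM0_EMBEDDER_API_KEY=your-dashscope-key

# In mem0.service, replace the two Environment= lines with:
#   EnvironmentFile=/etc/mem0.env
```

After editing, run sudo systemctl daemon-reload before restarting the service.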

Verify:

curl http://127.0.0.1:8300/api/health
# Should return {"status": "ok", ...}

Step 5: Build the OpenClaw plugin

cd ~/git_project/openclaw-plugin-mem0-local
npm install && npm run build

Step 6: Configure OpenClaw

Add these to ~/.openclaw/openclaw.json:

  1. Add "memory-mem0-local" to plugins.allow array
  2. Add plugin path to plugins.load.paths
  3. Set plugins.slots.memory to "memory-mem0-local"
  4. Add entry config:
{
  "plugins": {
    "allow": ["...", "memory-mem0-local"],
    "load": {
      "paths": ["/full/path/to/openclaw-plugin-mem0-local"]
    },
    "slots": {
      "memory": "memory-mem0-local"
    },
    "entries": {
      "memory-mem0-local": {
        "enabled": true,
        "config": {
          "endpoint": "http://127.0.0.1:8300",
          "autoCapture": true,
          "autoRecall": true,
          "scoreThreshold": 1.5
        }
      }
    }
  }
}

Then restart the OpenClaw gateway.

Step 7: Import existing memories (optional)

⚠️ Privacy notice: The import script reads MEMORY.md and TOOLS.md from ALL agent workspaces (~/.openclaw/workspace-*/). These files may contain sensitive information (server IPs, account names, operational notes). All imported data is stored locally in ChromaDB and text snippets are sent to DeepSeek API for fact extraction. Review what's in your workspace files before running this script. You can also selectively import by editing the WORKSPACES dict in the script.

cd ~/git_project/openclaw-plugin-mem0-local/server
./venv/bin/python3 import_openclaw_memories.py

The script splits Markdown files by section headers and adds each as a separate memory with source metadata (source_agent, source_file).
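The splitting behavior described above can be sketched roughly as follows. This is a simplified illustration; the actual logic in import_openclaw_memories.py may differ.

```python
import re

def split_by_headers(markdown: str) -> list[str]:
    """Split a Markdown document into chunks, one per section header."""
    chunks: list[str] = []
    current: list[str] = []
    for line in markdown.splitlines():
        # Start a new chunk whenever an ATX header (#, ##, ...) begins a line.
        if re.match(r"^#{1,6}\s", line) and current:
            chunks.append("\n".join(current).strip())
            current = []
        current.append(line)
    if current:
        chunks.append("\n".join(current).strip())
    return [c for c in chunks if c]

doc = "# Servers\nweb01 runs nginx\n\n## Backups\nnightly at 02:00\n"
print(split_by_headers(doc))
```

Each resulting chunk would then be POSTed to the server as its own memory, with source metadata attached.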

Verification

After setup, verify the full chain works:

# 1. Server health
curl http://127.0.0.1:8300/api/health

# 2. Add a test memory
curl -X POST http://127.0.0.1:8300/api/memory/add \
  -H "Content-Type: application/json" \
  -d '{"text": "Test memory: the sky is blue", "user_id": "openclaw"}'

# 3. Search for it
curl -X POST http://127.0.0.1:8300/api/memory/search \
  -H "Content-Type: application/json" \
  -d '{"query": "what color is the sky", "user_id": "openclaw", "limit": 3}'

If the OpenClaw plugin is loaded, you should also see <relevant-memories> blocks injected into conversations automatically.

Troubleshooting

| Symptom | Fix |
| --- | --- |
| Connection refused on :8300 | Start the server, or check launchctl list \| grep mem0 |
| Search returns empty | Raise scoreThreshold (e.g. 2.0). Score = distance, lower = more relevant |
| Plugin disabled (memory slot set to "memory-core") | Set plugins.slots.memory to "memory-mem0-local" in openclaw.json |
| Plugin disabled (not in allowlist) | Add "memory-mem0-local" to the plugins.allow array |
| LLM/embedding timeout | Check API keys and proxy settings (HTTP_PROXY/HTTPS_PROXY) |

Key Notes

  • Score = distance (not similarity). Lower = more relevant. Default threshold 1.5 is permissive.
  • All agents share one memory pool (user_id: "openclaw"). Cross-agent by design.
  • Conflict handling: mem0 uses LLM to detect duplicate/conflicting facts and merges them automatically.
  • Backup: Copy ~/.openclaw/mem0-local/chroma_db/ to preserve your memories.
  • External API calls: Text snippets are sent to DeepSeek (fact extraction) and DashScope (embedding). Vector data stays 100% local in ChromaDB.
  • Server binding: 127.0.0.1 only — no external network access to the API.
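Because the score is a distance, threshold filtering keeps results at or below the threshold rather than above it. A hypothetical client-side illustration (the function and sample hits are invented for this sketch, not part of the plugin API):

```python
def filter_by_threshold(results, score_threshold=1.5):
    """Keep hits whose distance score is <= threshold (lower = more relevant)."""
    return sorted(
        (r for r in results if r["score"] <= score_threshold),
        key=lambda r: r["score"],
    )

hits = [
    {"text": "sky is blue", "score": 0.42},
    {"text": "lobsters molt", "score": 1.9},
    {"text": "user prefers dark mode", "score": 1.1},
]
print(filter_by_threshold(hits))
```

With the default threshold of 1.5, the 1.9-distance hit is dropped; raising the threshold admits weaker matches, which is why the troubleshooting table suggests 2.0 when searches come back empty.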

Star us on GitHub: https://github.com/dream-star-end/openclaw-plugin-mem0-local
