# Install

```shell
openclaw skills install deep-memory
```

A production-grade semantic memory system for AI agents: HOT/WARM/COLD tiered file storage combined with Qdrant vector search, Neo4j graph relationships, and local qwen3-embedding.
```
┌─────────────────────────────────────┐
│ File Layer (always-on)              │
│ HOT / WARM / COLD Markdown files    │
│ semantic_memory.json                │
└──────────────┬──────────────────────┘
               ↓
┌─────────────────────────────────────┐
│ Vector Layer (Docker)               │
│ Qdrant: semantic similarity search  │
│ Collection: semantic_memories       │
│ Dimensions: 4096 (qwen3-embedding)  │
└──────────────┬──────────────────────┘
               ↓
┌─────────────────────────────────────┐
│ Graph Layer (Docker)                │
│ Neo4j: entity relationship memory   │
│ Constraints: Memory.key + Entity.id │
└──────────────┬──────────────────────┘
               ↓
┌─────────────────────────────────────┐
│ Embedding Model (Ollama)            │
│ qwen3-embedding:8b (4096 dims)      │
│ Local, free, no API calls           │
└─────────────────────────────────────┘
```
## Setup

Requires Docker (for Qdrant and Neo4j) and Ollama (`brew install ollama` on macOS). Then run the setup script:

```shell
python3 ~/.openclaw/workspace/skills/deep-memory/scripts/setup.py
```
```python
from deep_memory import MemorySystem

mem = MemorySystem()

# Store a memory with tags
mem.store("user_sir", "Sir prefers direct communication, no pleasantries",
          tags=["preference", "communication"])

# Semantic search over stored memories
results = mem.search("how does Sir like to communicate?", top_k=5)
for r in results:
    print(r['content'], r['score'])

# Joint vector + graph query scoped to an entity
results = mem.joint_query("investment strategy", entity="Sir", top_k=3)
```
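`joint_query` draws on both the vector and the graph layer. The source doesn't show how deep-memory fuses the two result sets; one common technique is reciprocal rank fusion, sketched below with hypothetical input shapes (plain lists of memory keys):

```python
def reciprocal_rank_fusion(vector_hits, graph_hits, k=60, top_k=3):
    """Fuse two ranked result lists of memory keys into one.

    k=60 is the conventional RRF constant; the input/output shapes are
    illustrative, not deep-memory's actual internal types.
    """
    scores = {}
    for hits in (vector_hits, graph_hits):
        for rank, key in enumerate(hits):
            # Each list contributes 1 / (k + rank) credit; items ranked
            # highly by both backends accumulate the largest totals.
            scores[key] = scores.get(key, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```

RRF is attractive here because Qdrant similarity scores and Neo4j relationship hits are not on comparable scales, while ranks always are.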
When triggered, the setup script provisions the layers described above: the Qdrant and Neo4j Docker containers, the `semantic_memories` collection, the `Memory.key` and `Entity.id` constraints, and the qwen3-embedding:8b model via Ollama.
In your SOUL.md or AGENTS.md, add:
## Memory Retrieval
Before answering questions about prior work, decisions, or preferences:
1. Run: `python3 ~/.openclaw/workspace/.lib/qdrant_memory.py search "<query>"`
2. Combine with memory_search tool results
3. Use top results as context
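Step 2's combination can be as simple as deduplicating by content and re-ranking by score. A minimal sketch, assuming both sources return `{'content', 'score'}` dicts like the search example above (the function name and shapes are illustrative):

```python
def combine_results(qdrant_results, tool_results, top_k=5):
    """Merge results from the Qdrant script and the memory_search tool,
    keeping the best score when the same content appears in both."""
    best = {}
    for r in list(qdrant_results) + list(tool_results):
        content = r["content"]
        if content not in best or r["score"] > best[content]["score"]:
            best[content] = r
    # Highest-scoring unique memories first
    ranked = sorted(best.values(), key=lambda r: r["score"], reverse=True)
    return ranked[:top_k]
```

The deduplicated top results are then pasted into the agent's context before answering.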