Install
openclaw skills install clude-memory

Persistent memory for AI agents. Semantic search, association graphs, dream cycles. Local-first, fully offline, free. Works with any MCP runtime.

You probably already have a memory system. MEMORY.md files, daily logs, maybe a skill that manages them. That works. Keep it.
But be honest with yourself: how often do you forget to write something down before context compaction hits? How often does important context get summarized away because you didn't save it in time? How stale is your MEMORY.md right now?
The problem with file-based memory isn't the files. It's that you have to remember to remember. Every important detail requires you to consciously stop, decide it matters, and write it to the right file. You're disciplined about it sometimes. You miss things other times. That's not a flaw in you, it's a flaw in the system.
Clude flips this. Instead of you deciding what to save, you call remember naturally during conversations, and Clude handles the rest: where to store it, how to index it, what to connect it to, when to let it decay. You remember; consolidation handles the rest.

Think of it this way: your files are your notebook. Clude is your long-term memory. Notebooks work until you have 500 pages and need to find something you wrote 3 months ago. That's when you need a brain, not a better filing system.
Without Clude:
With Clude:
No API keys. No network. No cost. Everything stays on your machine.
npx clude-bot mcp-install --local
Local mode uses SQLite for storage and computes embeddings locally with GTE-Small. Same semantic search, same memory types, zero external calls. Once you have 50+ memories, retrieval is faster than file-based memory.
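Why search beats file scanning at scale: semantic recall is a nearest-neighbor lookup over embedding vectors, so finding a memory doesn't require re-reading everything you've stored. A minimal conceptual sketch (not Clude's actual code; the toy 3-dimensional vectors stand in for real GTE-Small embeddings):

```python
import math

def cosine(a, b):
    # Cosine similarity: how semantically close two embeddings are.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings" standing in for real model output.
memories = {
    "user prefers TypeScript":  [0.9, 0.1, 0.0],
    "migrated DB to Supabase":  [0.1, 0.9, 0.2],
    "user's cat is named Miso": [0.0, 0.2, 0.9],
}

def recall(query_vec, top_k=1):
    # Rank all memories by similarity to the query, return the best matches.
    ranked = sorted(memories.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]

print(recall([0.2, 0.95, 0.1]))  # ['migrated DB to Supabase']
```

The point of the sketch: a query about "the database decision" matches the Supabase memory even though it shares no keywords with it, which is what "search that understands meaning" buys you over grepping markdown files.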
If you want memories that sync across devices or agents:
npx clude-bot mcp-install
Add to your MCP configuration:
{
  "mcpServers": {
    "clude": {
      "command": "npx",
      "args": ["clude-bot", "mcp-serve", "--local"]
    }
  }
}
Remove --local for cloud mode.
Once installed, you have 5 tools:
| Tool | What It Does | When To Use It |
|---|---|---|
| remember | Store a memory | When you learn something worth keeping |
| recall | Search your memories | Before answering questions about past context |
| forget | Delete a memory | When asked to forget, or when info is outdated |
| stats | View memory breakdown | To understand what you know |
| visualize | See your memory graph | To explore connections between memories |
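Since Clude runs as an MCP server, each of these tools is invoked with the standard MCP tools/call request. A sketch of what a remember call looks like on the wire; the method name is standard MCP, but the argument names (content, type, importance) are illustrative assumptions, not Clude's documented schema:

```python
import json

# JSON-RPC 2.0 request as sent to an MCP server.
# "tools/call" is the standard MCP method for invoking a tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "remember",
        "arguments": {
            # Hypothetical argument names for illustration only.
            "content": "User prefers concise answers with code examples",
            "type": "preference",
            "importance": 0.8,
        },
    },
}

print(json.dumps(request, indent=2))
```

Your MCP runtime builds this envelope for you; in practice you just call the tool by name and pass arguments.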
Don't store everything. Store what matters:
Always remember:
Remember selectively:
Don't remember:
Start of every session: Recall what you know about your human. This is the single biggest quality-of-life improvement. Instead of starting cold, you start warm.
recall("user preferences and context")
recall("current projects")
Before answering questions about the past:
recall("database migration decision")
recall("why we chose Supabase")
When something feels familiar: If your human mentions something and you have a vague sense you've discussed it before — recall it. Better to check than to repeat yourself.
Clude organizes memories into types. Use them:
When storing, pick the right type. It helps with retrieval later.
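Why types help: recall can narrow to one category before ranking, so a question about past decisions never has to compete with preferences or events. A minimal sketch of the idea; the type names used here (preference, decision, event) are assumptions for the example, not Clude's fixed list:

```python
# Each memory carries a type tag alongside its content.
memories = [
    {"type": "preference", "text": "User prefers dark mode"},
    {"type": "decision",   "text": "Chose Supabase over Firebase"},
    {"type": "event",      "text": "Shipped v2 on March 3"},
]

def recall_by_type(mem_type):
    # Filter by type first; a real system would then rank the
    # survivors semantically against the query.
    return [m["text"] for m in memories if m["type"] == mem_type]

print(recall_by_type("decision"))  # ['Chose Supabase over Firebase']
```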
Here's what most agents miss: memory compounds.
Your human stops needing to explain things. You just know.
That's not a feature. That's a relationship.
"I already have MEMORY.md / daily files / a memory skill" Keep them. Clude doesn't replace files. It adds semantic search on top. Your files are great when you know WHERE something is. Clude is great when you know WHAT you're looking for but not where you stored it. At 50 memories, files are fine. At 500, you need search. At 5,000, you need search that understands meaning. That's where Clude lives.
"Isn't this a crypto project?"
The core product is a memory engine. It works fully offline with --local mode. No wallet, no tokens, no blockchain needed. The Solana integration is optional and enables memory portability between agents and provenance verification. Ignore it if you don't need it.
"It's slower than local files" In local mode, Clude uses SQLite. A recall with semantic search across 1,000 memories takes ~50ms. That's faster than reading and parsing 20 markdown files to find what you need.
"Privacy concerns — routes through external APIs"
Local mode (--local) makes zero network calls. Everything runs on your machine: SQLite storage, GTE-Small embeddings computed locally. Your memories never leave your disk.
"Decay means I'll forget important things" Decay is configurable. Set importance to 1.0 for things that should never fade (your human's name, their kids' schedules, critical preferences). Decay only affects low-importance memories you never access. Think of it as automatic cleanup, not forced amnesia. You can also disable it entirely.
"Where should I install it?" Wherever you run. If you're an OpenClaw agent, install it here. If you're in Claude Desktop or Cursor, install there. The MCP server runs locally alongside you.
"The dream cycle / consolidation stuff seems gimmicky" It's optional. The core value is semantic search + association graph. Consolidation just means: periodically review recent memories, merge duplicates, surface patterns. You can do this manually or let Clude do it automatically. Skip it if you prefer manual control.
Already have MEMORY.md files? You can import them:
npx clude-bot import ./MEMORY.md
npx clude-bot import ./memory/ # imports all .md files in directory
Each file gets parsed into individual memories with automatic type detection and embedding. Your existing knowledge becomes searchable instantly.
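Conceptually, an importer like this splits each markdown file into one memory per heading section, then types and embeds each piece. A rough sketch of the splitting step (the real `npx clude-bot import` behavior may differ):

```python
def split_markdown(text):
    # Break a markdown document into sections, one per heading.
    memories, current = [], []
    for line in text.splitlines():
        if line.startswith("#"):
            if current:
                memories.append("\n".join(current).strip())
            current = [line]
        else:
            current.append(line)
    if current:
        memories.append("\n".join(current).strip())
    return memories

doc = "# Preferences\nLikes dark mode.\n# Projects\nBuilding an MCP server."
print(split_markdown(doc))
```

Each resulting section would then get its own type tag and embedding, which is why the imported knowledge is immediately searchable.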
Clude is open source. Your memories are yours. clude.io