# lumetra-engram

Persistent, explainable memory for your OpenClaw agent: store facts and recall them later via the hosted Engram MCP server (by Lumetra).

## Install

```
openclaw skills install lumetra-engram
```

## Overview

You have access to Engram, a hosted memory service for AI agents. Engram lets you remember facts, decisions, and context across conversations using a hybrid retrieval engine (BM25 + vector + knowledge graph) and returns an explanation trace with every recall.
## Usage

- When a question may depend on stored context, call `query_memory` first and ground your answer in the results.
- When the user shares a durable fact or decision, call `store_memory` to capture it.

## Tools

| Tool | Description |
|---|---|
| `store_memory(content, bucket?)` | Save a fact. `bucket` defaults to `"default"`. |
| `query_memory(question, bucket?)` | Hybrid retrieval + synthesized answer with citations. |
| `list_memories(bucket, limit?)` | List memories in a bucket, newest first (`limit` 1–100, default 20). |
| `list_buckets()` | Show all buckets in the tenant. |
| `delete_memory(memory_id, bucket)` | Delete one memory by ID. |
| `clear_memories(bucket)` | Delete every memory in a bucket (destructive!). |
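An MCP client normally handles the wire format for you, but for illustration, each tool above is invoked as a JSON-RPC 2.0 `tools/call` request. A minimal sketch (the bucket name and question text here are invented for the example):

```python
import json

def engram_tool_call(tool: str, arguments: dict, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 `tools/call` request, the shape MCP servers accept."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Save one atomic fact in a hypothetical "work" bucket...
store_req = engram_tool_call(
    "store_memory",
    {"content": "User prefers dark mode.", "bucket": "work"},
)

# ...then ask a question against the same bucket.
query_req = engram_tool_call(
    "query_memory",
    {"question": "What UI theme does the user prefer?", "bucket": "work"},
    request_id=2,
)

print(json.dumps(store_req))
```

Optional parameters such as `bucket` are simply omitted from `arguments` when unused.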
## Memory style

Keep each memory short and atomic. Good: "User prefers dark mode." Bad: "The user mentioned they like dark mode, also they live in Seattle, also..."

## Buckets

Use buckets to separate contexts, e.g. `"work"`, `"personal"`, `"project-alpha"`. If no bucket fits, omit it and the default bucket is used.

## Bring your own key

Engram is bring-your-own-key end-to-end: inference (embeddings, synthesis, graph extraction) runs through the user's OpenAI / Anthropic / Groq / Together / Fireworks key configured at https://lumetra.io/models. Without a provider key, every `store_memory` and `query_memory` call returns HTTP 412. If you see that error, tell the user to visit the models page.