Openclaw Memory

Suspicious. Audited by ClawScan on May 10, 2026.

Overview

The memory and search features are mostly disclosed, but the Observer can send the wrong provider API key to a remote LLM service.

Review this skill before installing if you plan to use the Observer. Prefer passing an explicit apiKey for the selected provider and avoid relying on environment fallback until the provider-specific key handling is fixed. Do not send sensitive conversations to the Observer unless you are comfortable with the chosen LLM provider processing them; the ALMA and Indexer components appear local/offline in the provided source.

Findings (3)

Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.

Finding 1: Cross-provider API key fallback

What this means

A provider API key in your environment could be disclosed to or attempted against the wrong LLM provider.

Why it was flagged

The same fallback key is reused regardless of which provider is selected: an Anthropic or Gemini call can receive an OpenAI or Anthropic environment key instead of the matching provider-specific key.

Skill content
const key = this.config.apiKey || process.env.OPENAI_API_KEY || process.env.ANTHROPIC_API_KEY || ''; ... 'x-api-key': key ... 'x-goog-api-key': key
Recommendation

Require provider-specific credentials: OPENAI_API_KEY only for OpenAI, ANTHROPIC_API_KEY only for Anthropic, and GEMINI_API_KEY or an explicit apiKey for Gemini. Fail closed when the matching key is missing, and declare these credentials in the skill's metadata.
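The fail-closed lookup described above can be sketched as follows. This is an illustrative sketch, not the skill's actual code: the `resolveApiKey` name, the env-var mapping table, and the injectable `env` parameter are all assumptions introduced for the example.

```javascript
// Map each provider to the only environment variable it may read.
// Assumption: this table and resolveApiKey are hypothetical names.
const PROVIDER_ENV_KEYS = {
  openai: 'OPENAI_API_KEY',
  anthropic: 'ANTHROPIC_API_KEY',
  gemini: 'GEMINI_API_KEY',
};

function resolveApiKey(provider, config = {}, env = process.env) {
  // An explicitly configured apiKey always wins.
  if (config.apiKey) return config.apiKey;
  const envVar = PROVIDER_ENV_KEYS[provider];
  const key = envVar ? env[envVar] : undefined;
  // Fail closed: never fall back to another provider's key.
  if (!key) {
    throw new Error(
      `Missing credential for provider "${provider}" ` +
      `(expected ${envVar || 'an explicit apiKey'})`
    );
  }
  return key;
}
```

With this shape, an Anthropic call made in an environment that only defines OPENAI_API_KEY raises an error instead of silently sending the OpenAI key in the `x-api-key` header.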

Finding 2: Conversation content sent to remote LLM providers

What this means

Conversation history may leave the local environment and be processed by the configured LLM provider.

Why it was flagged

Conversation content is assembled into a prompt and sent to remote LLM provider endpoints. This is disclosed and purpose-aligned, but it crosses a data boundary.

Skill content
const sanitized = messages.map(m => `${m.role}: ${this.sanitize(m.content)}`).join('\n'); ... fetch('https://api.openai.com/v1/chat/completions'
Recommendation

Use the Observer only for conversations you are comfortable sending to the selected provider; use ALMA or Indexer offline for sensitive local-only work.
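One way to enforce that boundary is to gate prompt assembly behind an explicit opt-in and redact obvious credential patterns before anything leaves the machine. This is a hypothetical hardening sketch, not the skill's sanitizer: `redactSecrets`, `buildObserverPrompt`, the `allowRemote` flag, and the pattern list are all assumptions.

```javascript
// Assumption: these patterns and function names are illustrative,
// not part of the audited skill.
const SECRET_PATTERNS = [
  /sk-[A-Za-z0-9]{20,}/g,   // OpenAI-style API keys
  /AKIA[0-9A-Z]{16}/g,      // AWS access key IDs
];

function redactSecrets(text) {
  // Replace anything matching a known secret shape before it is sent.
  return SECRET_PATTERNS.reduce((t, re) => t.replace(re, '[REDACTED]'), text);
}

function buildObserverPrompt(messages, allowRemote) {
  // Refuse to assemble a remote prompt unless the caller opted in.
  if (!allowRemote) {
    throw new Error('Remote Observer disabled; use ALMA/Indexer for local-only work');
  }
  return messages
    .map(m => `${m.role}: ${redactSecrets(m.content)}`)
    .join('\n');
}
```

Pattern-based redaction is best-effort and will miss free-form secrets, so the opt-in gate, not the redaction, is the primary control here.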

Finding 3: Local memory files indexed for search

What this means

Private facts placed in indexed memory files can be surfaced in search results and in later agent context.

Why it was flagged

The indexer reads memory-related Markdown files from a configured workspace and stores chunks in an in-memory search index. The scope is bounded, but indexed content can be reused in later agent context.

Skill content
const memoryMd = join(this.workspace, 'MEMORY.md'); ... join(this.workspace, 'bank', 'entities'); ... const opinions = join(this.workspace, 'bank', 'opinions.md');
Recommendation

Point the workspace only at intended memory directories and avoid storing secrets or highly sensitive data in indexed memory files.