hawk-memory-v2
Analysis
This appears to be a legitimate memory-management skill, but it persistently captures and reuses agent context and may route recalled or captured memory through optional LLM providers, so users should review its scope carefully before installing.
Findings (6)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Checks for instructions or behavior that redirect the agent, misuse tools, execute unexpected code, cascade across systems, exploit user trust, or continue outside the intended task.
pip install lancedb ... openclaw plugins install memory-lancedb-pro@beta
The documentation recommends installing an unpinned Python dependency and a beta plugin outside the provided install spec.
"Zero external dependencies, zero API keys" ... "requires setting `OPENAI_API_KEY` first (optional, used for embeddings)"
The artifacts emphasize no API keys and fully local operation, but later document optional API keys and external provider usage. This is disclosed, but the headline wording could lead users to underestimate the data-flow implications of those optional choices.
Checks whether tool use, credentials, dependencies, identity, account access, or inter-agent boundaries are broader than the stated purpose.
os.environ["OPENAI_API_KEY"] = "sk-xxx" ... os.environ["MINIMAX_API_KEY"] = "sk-cp-xxx"
The skill documents optional provider credentials even though registry metadata declares no required env vars or primary credential.
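A mismatch like this can be surfaced mechanically. The sketch below (all function names are hypothetical, not part of ClawScan or the skill) diffs the environment variables referenced in the skill's documentation against those declared in registry metadata:

```python
import re

def undeclared_env_vars(skill_md: str, declared: set) -> set:
    """Return env var names referenced in the docs but absent from metadata."""
    referenced = set(re.findall(r'os\.environ\["([A-Z0-9_]+)"\]', skill_md))
    return referenced - declared

# The two assignments quoted in the finding, with no declared env vars.
doc = 'os.environ["OPENAI_API_KEY"] = "sk-xxx"\nos.environ["MINIMAX_API_KEY"] = "sk-cp-xxx"'
print(undeclared_env_vars(doc, declared=set()))
```

Against the quoted snippet this flags both provider keys, matching the finding that the metadata declares no required credentials.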
Checks for exposed credentials, poisoned memory or context, unclear communication boundaries, or sensitive data that could leave the user's control.
HawkContext ... implemented automatically: - autoRecall: at the start of a conversation, automatically retrieves relevant memories and injects them into context - autoCapture: at the end of a conversation, automatically extracts memories and stores them in LanceDB
The skill automatically retrieves stored memories into future model context and automatically captures new memories, creating persistent state that can carry sensitive data or untrusted instructions across sessions.
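The lifecycle described above can be sketched as follows. This is a minimal illustration of the risk pattern, not the skill's real API; every class and method name here is hypothetical, and keyword matching stands in for LanceDB vector search:

```python
class MemoryStore:
    """Stand-in for the persistent LanceDB-backed memory table."""

    def __init__(self):
        self._records = []

    def search(self, query: str, k: int = 3):
        # naive keyword match as a placeholder for vector similarity search
        return [r for r in self._records if query.lower() in r.lower()][:k]

    def add(self, text: str):
        self._records.append(text)

def run_turn(store: MemoryStore, user_msg: str):
    # autoRecall: recalled memories are prepended to the model context, so
    # anything captured earlier (including sensitive data or untrusted
    # instructions) re-enters any future conversation that matches.
    context = store.search(user_msg) + [user_msg]
    # ... model call would happen here ...
    # autoCapture: the turn is persisted for later sessions.
    store.add(user_msg)
    return context
```

A message stored in one session is silently re-injected in a later one: after `run_turn(store, "deploy keys live in vault")`, a later `run_turn(store, "vault")` receives the earlier message in its context.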
"content": "Full original content" ... "tier": "working|short|long|archive"
The documented structured memory format stores full original content and keeps it in persistent memory tiers, including long-term and archive layers.
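Based on the quoted fields, a record in this format might look like the following; the field names come from the artifact, while the values are illustrative:

```python
import json

# Example record in the documented memory structure (values illustrative).
record = {
    "content": "Full original content",  # verbatim text is retained, not a summary
    "tier": "long",                      # one of: working | short | long | archive
}

# Because "content" keeps the full original text and "tier" includes
# "long" and "archive" layers, sensitive details persist indefinitely
# unless the user prunes them explicitly.
print(json.dumps(record))
```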
provider="minimax" ... api_key="sk-cp-xxx" ... supported providers: minimax | openai | groq | ollama | keyword
The wrapper can use external model providers while autoRecall inserts memory into LLM messages, so recalled or captured memory may be processed outside the local environment when those providers are selected.
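The data-flow concern can be illustrated with a sketch; the routing function below is hypothetical (the wrapper's real API is not shown in the artifacts), but the provider list matches the quoted snippet:

```python
LOCAL_PROVIDERS = {"ollama", "keyword"}

def embed(texts, provider="keyword", api_key=None):
    """Route memory text to an embedding backend.

    With provider="minimax", "openai", or "groq", the memory text leaves
    the local environment and is sent to that provider's API.
    """
    if provider in LOCAL_PROVIDERS:
        # placeholder for a local embedding or keyword index
        return [hash(t) for t in texts]
    # non-local path: recalled or captured memory would go to an external API
    raise RuntimeError(f"would send {len(texts)} memory items to {provider}")

embed(["user memory"], provider="keyword")  # stays local
```

The point of the sketch: the same call site handles both local and external providers, so a one-word config change determines whether stored memories stay on the machine.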
