Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

M-Flow Memory

v1.0.0

Four-phase memory management built on the M-Flow framework, supporting memory add, indexing, multi-mode search (lexical, vector, and triplet), and memory distillation.

by sune@sora-mury

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for sora-mury/m-flow-memory.

Prompt preview — Install & Setup:
Install the skill "M-Flow Memory" (sora-mury/m-flow-memory) from ClawHub.
Skill page: https://clawhub.ai/sora-mury/m-flow-memory
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install m-flow-memory

ClawHub CLI

Package manager switcher

npx clawhub@latest install m-flow-memory
Security Scan
VirusTotal
Benign
View report →
OpenClaw
Suspicious
medium confidence
Purpose & Capability
Name/description (M-Flow memory) aligns with the provided client wrapper and test script; however, the package expects a local 'm_flow' Python package to exist (via sys.path manipulation), while the skill bundle includes neither that package nor an install spec. SKILL.md also documents multiple environment variables and services (ollama, LanceDB, Kuzu, MCP server) that are not declared in registry metadata. Requiring a local LLM/embedding endpoint and a specific LanceDB version is plausible for the stated purpose, but the absence of an included m_flow module, or of instructions to install it, is inconsistent.
Instruction Scope
SKILL.md confines actions to memory add/index/search/distill workflows and instructs setting HF_HUB_OFFLINE and local endpoints (OLLAMA). The runtime instructions do not ask the agent to read unrelated system files or exfiltrate data. However, they depend on running local services (ollama endpoints) and integration with other components (knowledge-distillation skill, MCP server) which broaden operational scope and are not contained in the skill bundle.
Install Mechanism
No install spec is provided (instruction-only), which is lowest-risk in principle. But SKILL.md requires installing lancedb==0.26.0 and references m_flow>=0.3.1 and m_flow>=0.5.0 features; there is also a typographical 'uv pip install' command. The skill assumes external packages and services exist without providing a reliable or documented install mechanism, increasing friction and risk of misconfiguration.
Credentials
Registry metadata declares no required env vars, yet SKILL.md lists HF_HUB_OFFLINE and a detailed .env with LLM_PROVIDER, OLLAMA_BASE_URL, EMBEDDING_ENDPOINT, LLM_API_KEY, etc. The code itself sets HF_HUB_OFFLINE automatically. This mismatch (undeclared but required env settings) is a red flag: the skill expects connectivity/credentials to local LLM/embedding services but does not declare or justify those variables in metadata.
Persistence & Privilege
The skill does not request permanent/always-on inclusion and does not modify other skills or system-wide settings. It does set HF_HUB_OFFLINE in process env, but that is limited to the running process context.
What to consider before installing
This skill appears to be a wrapper around an external 'm_flow' library and a local LLM/embedding stack (ollama, LanceDB, Kuzu). Before installing or enabling it:

  1. Ask the author to provide the missing m_flow package or clear install instructions and a proper install spec (pip/packaging) — the bundle currently lacks the m_flow code it imports.
  2. Confirm required environment variables and secrets (OLLAMA endpoints, embedding endpoint, API keys) are documented in the registry metadata; don't supply remote credentials unless you understand where they'll be used.
  3. Run the skill in an isolated environment (container) if you want to test; note that SKILL.md expects local services (localhost:11434) and a specific LanceDB version.
  4. Fix/confirm typos in installation commands ('uv pip install').
  5. If you cannot obtain the upstream m_flow source, treat the skill as unready: it may fail or behave unpredictably.

Overall, the package is not evidently malicious but is inconsistent and under-documented — proceed only after remediation or additional author clarification.

Like a lobster shell, security has layers — review code before you run it.

latest: vk97e27m9gevqq8j9ngcypxdncs849s3z
86 downloads
0 stars
1 version
Updated 3w ago
v1.0.0
MIT-0

M-Flow Memory Skill

Description

A memory system built on M-Flow, providing a Karpathy-style 4-phase memory management workflow.

M-Flow is a memory framework integrating Cone Graph, vector search (LanceDB), and a graph database (Kuzu).

Activation Conditions

  • The user requests memory distillation, memory search, or memory management
  • The Karpathy LLM Knowledge Base workflow is needed

Core Features

1. Adding memories (add)

from m_flow_memory import MFlowMemory

memory = MFlowMemory()
await memory.add("session content or knowledge point")

2. Indexing memories (memorize)

await memory.memorize()  # index all unprocessed content

3. Searching memories (search)

# full-text search (BM25)
results = await memory.search("query", mode="lexical")

# vector search (Cone Graph)
results = await memory.search("query", mode="episodic")

# triplet search
results = await memory.search("query", mode="triplet")

M-Flow Configuration Requirements

Required Environment Variables

HF_HUB_OFFLINE=1
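The scan notes that the client code sets this variable itself. If you drive m_flow from your own script instead, a minimal sketch of doing the same — the key point (our reading, not stated in the bundle) is to set it before importing any Hugging Face library, which caches its online/offline decision at import time:

```python
import os

# Must happen before `import huggingface_hub` / model libraries,
# otherwise the hub client may already be in online mode.
os.environ["HF_HUB_OFFLINE"] = "1"

print(os.environ["HF_HUB_OFFLINE"])  # → 1
```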

.env configuration (m_flow/.env)

LLM_PROVIDER="ollama"
LLM_MODEL="ollama/qwen2.5:14b-instruct-q8_0"
OLLAMA_BASE_URL="http://localhost:11434/v1"
LLM_ENDPOINT="http://localhost:11434/v1"
LLM_API_KEY="ollama"

EMBEDDING_PROVIDER="ollama"
EMBEDDING_MODEL="nomic-embed-text:latest"
EMBEDDING_ENDPOINT="http://localhost:11434/v1/embeddings"
EMBEDDING_DIMENSIONS=768

VECTOR_DB_PROVIDER=lancedb
DB_PROVIDER=sqlite
GRAPH_DATABASE_PROVIDER=kuzu
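How m_flow actually loads this file is not shown in the bundle (a library such as python-dotenv would be typical). As a hedged sketch only, a minimal parser for the KEY=VALUE format above, assuming no multiline values; `load_env` is our name, not an m_flow API:

```python
import os
import tempfile

def load_env(path: str) -> dict[str, str]:
    """Parse a simple KEY=VALUE .env file, stripping optional quotes."""
    values: dict[str, str] = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, raw = line.partition("=")
            values[key.strip()] = raw.strip().strip('"').strip("'")
    return values

# Round-trip two of the variables from the block above.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write('LLM_PROVIDER="ollama"\nEMBEDDING_DIMENSIONS=768\n')
    path = fh.name

cfg = load_env(path)
os.unlink(path)
print(cfg["LLM_PROVIDER"], cfg["EMBEDDING_DIMENSIONS"])  # → ollama 768
```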

LanceDB Version

⚠️ Important: LanceDB must be pinned to 0.26.0 (0.27.1 has a bug)

uv pip install lancedb==0.26.0
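Because 0.27.1 is known to be broken, it can help to verify the pin at startup and fail fast rather than hit the bug later. A sketch using only the standard library — the lancedb pin comes from SKILL.md, while the `check_pin` helper is our own illustration, not part of the skill:

```python
from importlib import metadata

def check_pin(package: str, required: str) -> bool:
    """Return True iff the installed version of `package` matches `required` exactly."""
    try:
        return metadata.version(package) == required
    except metadata.PackageNotFoundError:
        return False

# The check is generic; here applied to the pin from SKILL.md.
print(check_pin("lancedb", "0.26.0"))
```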

Karpathy 4-Phase Workflow

Phase 0: Audit & Activation

  • Inventory existing resources
  • Establish KNOWLEDGE-STANDARDS.md
  • Activate the knowledge-archive collection

Phase 1: Query → Wiki Backflow

  • The user issues a query
  • Results are stored in the wiki layer
  • Format spec: source | content | tags | timestamp
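SKILL.md does not show what a concrete wiki-layer record looks like. A hedged sketch of a formatter for the spec above — the field order is taken from the bullet, while the ISO-8601 UTC timestamp format and the comma-joined tags are our assumptions:

```python
from datetime import datetime, timezone

def wiki_entry(source: str, content: str, tags: list[str]) -> str:
    """Render one wiki-layer line as: source | content | tags | timestamp."""
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return " | ".join([source, content, ",".join(tags), ts])

line = wiki_entry("query", "LanceDB must be pinned to 0.26.0", ["lancedb", "pin"])
print(line)
```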

Phase 2: Compile Layer

  • Use the knowledge-distillation skill
  • session → knowledge points
  • wiki → structured knowledge

Phase 3: Lint Layer

  • Weekly health check
  • Deduplicate, merge, update
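The bundle does not include the lint pass itself. As an illustration of the deduplicate step only, a minimal sketch that keeps the first occurrence of each knowledge point; the case- and whitespace-insensitive normalization rule is our assumption:

```python
def lint_dedupe(points: list[str]) -> list[str]:
    """Drop duplicate knowledge points (case/whitespace-insensitive), keeping first seen."""
    seen: set[str] = set()
    kept: list[str] = []
    for p in points:
        key = " ".join(p.lower().split())  # normalize case and runs of whitespace
        if key not in seen:
            seen.add(key)
            kept.append(p)
    return kept

points = ["Use LanceDB 0.26.0", "use  lancedb 0.26.0", "Kuzu locks files on Windows"]
print(lint_dedupe(points))  # → ['Use LanceDB 0.26.0', 'Kuzu locks files on Windows']
```

A real lint pass would also need the merge and update steps, which likely require LLM assistance rather than pure string matching.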

Phase 4: M-Flow Integration

  • MCP server (requires m_flow>=0.5.0)
  • Shared across multiple agents

Directory Structure

m-flow-memory/
├── SKILL.md           # this file
├── scripts/
│   ├── __init__.py
│   ├── client.py      # M-Flow Python client wrapper
│   ├── distill.py     # memory distillation
│   ├── query.py       # memory queries
│   └── config.py      # configuration
└── docs/
    └── README.md      # detailed documentation

Usage Examples

Adding a memory

await memory.add("The user asked about OpenClaw's memory system architecture")

Searching memories

# episodic (vector) search
results = await memory.search(
    "OpenClaw memory system",
    mode="episodic",
    top_k=10
)

Memory distillation

# extract knowledge points from a session
knowledge_points = await memory.distill(session_transcript)

Known Limitations

  1. The M-Flow REST API requires authentication (currently configured for development mode)
  2. The Kuzu graph database may hit file-locking issues on Windows
  3. LanceDB 0.27.1 has a bug; 0.26.0 must be used

Dependencies

  • m-flow >= 0.3.1
  • lancedb == 0.26.0
  • ollama (running locally)
  • HF_HUB_OFFLINE=1
