Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Xhsfenxi Pro

v2.0.0

Full-pipeline Xiaohongshu analysis skill. Data collection (SeleniumBase XHR interception) + deep blogger analysis (three-archetype classification + five-layer model + hsword core-identity triad) + viral topic formulas (6 models + 30 topic directions) + structured reports + Heiti-font Word delivery. Integrates xhscosmoskill (collection engine) + xhsfenxi (analysis framework) + h...

by Cosmos Fang (@cosmofang)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for cosmofang/xhsfenxi-pro.

Prompt preview (Install & Setup):
Install the skill "Xhsfenxi Pro" (cosmofang/xhsfenxi-pro) from ClawHub.
Skill page: https://clawhub.ai/cosmofang/xhsfenxi-pro
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install xhsfenxi-pro

ClawHub CLI


npx clawhub@latest install xhsfenxi-pro
Security Scan
Capability signals
Requires sensitive credentials
These labels describe what authority the skill may exercise. They are separate from suspicious or malicious moderation verdicts.
VirusTotal: Suspicious
OpenClaw: Suspicious (medium confidence)
Purpose & Capability
The name/description (Xiaohongshu full‑pipeline analysis) matches the code: browser-based XHR interception, data parsing, analysis, formula/report generation and Word export. The package includes scraping client, analyzer, formula engine and report builders which are appropriate for the stated goal. Minor surprises: it includes a post_comment method (ability to act as the logged-in user) and persistent local DB writes (data/bloggers.json, data/archetypes.json). Those capabilities can be legitimate for this tool but are privileged and should be explicit in the description.
Instruction Scope
SKILL.md and README instruct the agent to read and use local cookie files (absolute paths to a developer's Desktop), rely on an external xhs_login.py to generate cookies (but xhs_login.py is not present in the provided file list), and add a developer LIB_ROOT to sys.path in examples. The runtime instructions reference writing to local DB files and generating Word docs. There are hard-coded/example local paths and assumptions about where cookie files live — this couples runtime to the originating developer environment and implies the skill will access files on the host filesystem (cookies, reports). The instructions do not clearly enumerate needed Python packages or browser drivers yet direct the agent to run browser-based scraping.
Install Mechanism
No formal install spec is present. The README suggests 'pip install seleniumbase' and the code relies on SeleniumBase UC mode and CDP (Chrome/Chromium + drivers). Those runtime dependencies (seleniumbase, a compatible browser, undetected/UC mode components) are not declared in metadata. Lack of an install spec or declared dependencies increases friction and risk: the agent/user may install/execute this without satisfying the browser/tooling requirements, or may need to add third‑party helper packages that aren't tracked here.
Credentials
The skill does not request API keys/secret env vars, which is appropriate. However, it requires a user login cookie file (xhs_cookies.json) to function; this credential requirement is not documented in the registry metadata. The code will read and inject cookies (and can post comments using them), and it persists analysis results to local files. Asking for cookie files (which are equivalent to session credentials) is proportional to a scraper, but this credential handling is not made explicit in the declared requirements and could be overlooked by users.
Persistence & Privilege
The skill does not set always:true and does not request system-level privileges. It does persist data and modify files under its own package (data/archetypes.json, data/bloggers.json) — this is consistent with a local 'database' but means the skill will write to the installed package directory. It does not appear to modify other skills or global agent settings. The ability to post comments means, if given valid cookies, it can act on behalf of the user — a nontrivial privilege.
What to consider before installing
This package appears to be a developer-created Xiaohongshu scraper + analysis tool and largely does what it says, but review before using:

  1) It requires SeleniumBase, a compatible browser (Chrome/Chromium) and drivers — these are not declared in the registry; install them in a controlled environment (e.g., a VM or container).
  2) The tool needs a login cookie file (xhs_cookies.json) to access authenticated endpoints; cookies are effectively session credentials — treat them like secrets.
  3) The README/SKILL.md reference a login script (xhs_login.py) that is not included — confirm how you will obtain safe cookies.
  4) The code can post comments (post_comment) as the logged-in user — avoid using your primary account until you trust the code.
  5) SKILL.md contains absolute developer paths (Desktop paths, LIB_ROOT) suggesting it is tailored to the author's machine — change paths to safe locations.
  6) The skill writes to files inside its package (data/*.json) and creates Word reports — consider running it in an isolated environment and inspecting outputs.

If you need to proceed, run in a sandbox, review xhs_login.py or create your own secure cookie-generation workflow, and audit third-party dependencies (seleniumbase and any underlying undetected/UC tooling) before granting credentials.

Like a lobster shell, security has layers — review code before you run it.

Latest: vk9738esj4q85d4dx3xekvq1n3h85dbkg
68 downloads · 0 stars · 1 version
Updated 4d ago · v2.0.0 · MIT-0

xhs-cosmo — Xiaohongshu Full-Pipeline Analysis Skill

Data collection × three-archetype classification × core-identity triad × 6-model viral topic formulas × Heiti-font Word delivery


Tool Paths

LIB_ROOT:    /Users/zezedabaobei/Desktop/cosmocloud/Deeplumen/cosmowork/xiaohongshu_new
COOKIES:     /Users/zezedabaobei/Desktop/cosmocloud/Deeplumen/cosmowork/shopify-marketing/xhs_cookies.json
COOKIES_ALT: /Users/zezedabaobei/Desktop/cosmocloud/Deeplumen/cosmowork/xiaohongshu_new/xhs_cookies.json
DATA_DIR:    xhscosmoskill/data/          (archetypes.json, bloggers.json)
HSWORD_REF:  openclaw_cosmo/afa/hsword/   (case-study archive)
BUILD_DOCX:  xhscosmoskill/scripts/build_docx.py

Capability Overview

Capability              Source                  Function
Data collection         xhscosmoskill           User profile notes, keyword search, note details, comments
Analysis framework      xhsfenxi                Three-archetype classification, five-layer account model, evidence grading
Core-identity triad     hsword                  Shell / true core / three-layer persona structure
Viral topic formulas    hsword                  6 models × reusable sentence templates × 30 topic directions
Transferable framework  hsword                  How to transfer a blogger's methodology to your own account
Report generation       combined                Structured report + viral topic formulas + multi-account comparison
Word delivery           scripts/build_docx.py   Full Heiti style + green accent lines

Three-Archetype Blogger Classification System

Type A — Absurdist Aesthetics

  • Core: absurd humor wrapping a philosophical core; unified brand mark (e.g. "(劲爆)")
  • Formula: absurd scene × brand mark × lightweight philosophy → aesthetic resonance
  • Representative: 井越

Type B — Resonance Naming

  • Core: private experience → universal proposition; names vague emotions; "*" brand mark
  • Formula: private scene × proposition-making × poetic naming × "*" stamp → universal resonance
  • Representatives: xixiCharon, 橘一橙NiceFriend

Type C — Reality Strategy

  • Core: breaks unwritten rules, offers executable upward strategies, anti-respectability voice
  • Formula: predicament → name the rule → offer a strategy → satisfying execution
  • Representative: 丑穷女孩陈浪浪

Hybrid (strongest combinations)

  • B+A: both "names the emotion" and "has aesthetic quality" (xixiCharon)
  • B+C: both "gets you" and "tells you what to do next"
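How such a classification can be automated: the package documents a classify_archetype(notes) function and an evolving signal-word registry (data/archetypes.json). The sketch below is an illustrative stand-in, not the package's actual implementation, and the signal words are invented for the example:

```python
# Illustrative keyword-signal classifier in the spirit of classify_archetype.
# The real signal words live in data/archetypes.json; these are made up.
from collections import Counter

ARCHETYPE_SIGNALS = {
    "A": ["absurd", "aesthetic", "(劲爆)"],    # Absurdist Aesthetics
    "B": ["naming", "emotion", "*"],           # Resonance Naming
    "C": ["strategy", "rules", "executable"],  # Reality Strategy
}

def classify_archetype_sketch(titles):
    """Score each archetype by signal-word hits; return (type, confidence)."""
    scores = Counter()
    for title in titles:
        for archetype, signals in ARCHETYPE_SIGNALS.items():
            scores[archetype] += sum(1 for s in signals if s in title)
    total = sum(scores.values())
    if total == 0:
        return ("unknown", 0.0)
    best, hits = scores.most_common(1)[0]
    return (best, hits / total)
```

A real implementation would also weight signals and detect the hybrid types (B+A, B+C) described above.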

hsword Core-Identity Triad (mandatory step)

Every analysis must make three layers explicit:

What is the shell? (What kind of blogger does it look like on the surface?)
        ↓
What is the true core? (One distilled sentence, set in quotation marks)
        ↓
Three-layer persona structure:
  Surface labels   → identity / scenes / tags
  Mid-layer traits → personality / ability / temperament
  Deep values      → what it encourages / endorses / conveys

Key insight: What actually makes an account work is the third layer. The first layer can be imitated and the second can be packaged, but the third is believed only after long-term content consistency.


Viral Topic Formula System (6 Models)

Model 1: Scene × Philosophy (highest average likes for Type B)

[specific place/scene] + [philosophical state felt there] + [brand mark]

Mechanism: grand place × everyday feeling × poetic language = triple tension

Model 2: State Naming (high save rate)

[person good at / in the habit of doing X] + perceives [thing] as [new understanding] + [brand mark]

Mechanism: translates a murky state into a graspable concept → the user's satisfaction at "being named"

Model 3: Stage Declaration (a must at milestones)

[time milestone/age stage] + [growth judgment for this stage] + [brand mark]

Mechanism: milestone-triggered emotional intensity × personal narrative × peer resonance

Model 4: Solo Declaration (independent-women resonance)

[I alone] + [action] + [counter-expectation result]

Mechanism: independent-woman identity × demonstrated agency × counter-expectation = triple resonance

Model 5: Emotional Reversal (healing genre, high saves)

[negative state] + actually is / taught me + [positive insight] + [brand mark]

Mechanism: low-point resonance + a reversal exit = "the possibility of being healed"

Model 6: Worldview Output (strongest stickiness)

[person who keeps doing X / the meaning of persisting in Y] + [how they understand Z] + [brand mark]

Mechanism: what users save is a "worldview", so they keep following for more
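All six models share one shape: a slot template plus optional brand mark. A self-contained sketch of how a formula engine could fill such templates (the template strings paraphrase the models above; the real internals of generate_formula_report are not shown in this README):

```python
# Illustrative slot-template renderer for the six formula models.
# Templates paraphrase the documented formulas; this is not the real engine.
FORMULA_TEMPLATES = {
    1: "[specific place/scene] + [philosophical state felt there] + [brand mark]",
    2: "[person who habitually does X] + perceives [thing] as [new understanding] + [brand mark]",
    3: "[time milestone/age stage] + [growth judgment for this stage] + [brand mark]",
    4: "[I alone] + [action] + [counter-expectation result]",
    5: "[negative state] + actually is / taught me + [positive insight] + [brand mark]",
    6: "[person who keeps doing X] + [how they understand Z] + [brand mark]",
}

def render_topic(model_id, slots):
    """Fill a model's placeholders with concrete values; unfilled slots remain."""
    text = FORMULA_TEMPLATES[model_id]
    for placeholder, value in slots.items():
        text = text.replace(placeholder, value)
    return text
```

Leaving unfilled slots visible is deliberate: a "30 topic directions" report can ship partially filled templates for the creator to complete.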


Full Analysis Pipeline

Step 0  Cookie health check
        ↓
Step 1  Parse input (URL/ID/name) → check whether the database already has a record
        ↓
Step 2  Data collection (get_user_notes, limit=50, scroll_times=10)
        ↓
Step 3  Basic statistics (compute_stats)
        ↓
Step 4  Archetype classification (classify_archetype)
        ↓
Step 5  Core-identity triad (build_five_layers + manual shell/core/deep-values supplement)
        ↓
Step 6  Viral topic formula generation (generate_formula_report)
        ↓
Step 7  Generate structured Markdown report (mode='full')
        ↓
Step 8  Merge external documents (if the user supplies extra analysis docs)
        ↓
Step 9  Write to the blogger database (save_blogger)
        ↓
Step 10 Generate Word output (scripts/build_docx.py)
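The eleven steps form a linear chain in which each stage consumes earlier results. A minimal, illustrative orchestration skeleton (the stage functions here are stubs standing in for the xhscosmoskill APIs named above):

```python
# Illustrative pipeline runner: each stage reads the shared context and
# stores its result under its own name. Stages are stubs, not real calls.
def run_pipeline(user_input, stages):
    ctx = {"input": user_input}
    for name, stage in stages:
        ctx[name] = stage(ctx)  # later stages can read earlier results
    return ctx

# Example wiring with stub stages (Steps 0, 2, 3):
stages = [
    ("cookie_check", lambda ctx: True),
    ("notes",        lambda ctx: ["note1", "note2", "note3"]),
    ("stats",        lambda ctx: {"count": len(ctx["notes"])}),
]
```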

Data Collection API

import sys
# NOTE: absolute path from the author's machine; point this at your own checkout
sys.path.insert(0, "/Users/zezedabaobei/Desktop/cosmocloud/Deeplumen/cosmowork/xiaohongshu_new")
from xhscosmoskill import XhsClient
from xhscosmoskill.analyzer import analyze_account, classify_archetype, compute_stats, build_five_layers
from xhscosmoskill.formula import generate_formula_report
from xhscosmoskill.archetype_registry import save_blogger, list_archetypes, get_blogger

# NOTE: absolute path from the author's machine; cookies are session credentials
COOKIES = "/Users/zezedabaobei/Desktop/cosmocloud/Deeplumen/cosmowork/shopify-marketing/xhs_cookies.json"

user_id = "USER_ID_HERE"  # target blogger's user ID
name = "BloggerName"      # creator name used in the report

with XhsClient(cookies_file=COOKIES, headless=True, scroll_times=10) as xhs:
    notes = xhs.get_user_notes(user_id, limit=50)
    report = xhs.analyze_account(notes, creator_name=name, mode="full")

Report Generation API

Function                                                 Description
analyze_account(notes, creator_name, mode)               Main entry point; mode: full/formula/snapshot
classify_archetype(notes)                                Three-archetype classification; returns type + confidence
build_five_layers(notes, archetype)                      Five-layer account model
compute_stats(notes)                                     Basic statistics
generate_formula_report(notes, creator_name, archetype)  Viral topic formula report
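The mode parameter selects which report sections are built. An illustrative sketch of that dispatch (the real analyze_account internals are not published in this README; the section builders here are stubs):

```python
# Illustrative mode dispatch for the documented full/formula/snapshot modes.
VALID_MODES = {"full", "formula", "snapshot"}

def analyze_account_sketch(notes, creator_name, mode="full"):
    """Build a toy Markdown report whose sections depend on `mode`."""
    if mode not in VALID_MODES:
        raise ValueError(f"mode must be one of {sorted(VALID_MODES)}")
    sections = [f"# {creator_name}"]
    if mode in ("full", "snapshot"):
        sections.append(f"notes analyzed: {len(notes)}")  # stub stats section
    if mode in ("full", "formula"):
        sections.append("## Viral topic formulas (6 models)")  # stub formula section
    return "\n".join(sections)
```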

Word Generation

# Generate a Heiti-font Word document with a single command
python3 scripts/build_docx.py <md_path> <out_path> <title> <subtitle>

Or from code:

from xhscosmoskill.scripts.build_docx import build_word
build_word(md_path="/tmp/report.md", out_path="/tmp/report.docx",
           title="xixiCharon", subtitle="爆款选题公式")

Cookie Management

# Preferred (most recent)
COOKIES = ".../shopify-marketing/xhs_cookies.json"

# Fallback
COOKIES_ALT = ".../xiaohongshu_new/xhs_cookies.json"

# Cookie health check
from xhscosmoskill.utils import check_cookies
status = check_cookies(COOKIES)  # returns {valid: bool, expired_keys: list}

Cookie expiry signal: notes returns ≤ 1 item → prompt the user to re-run xhs_login.py
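The two-path cookie priority and the "≤ 1 note" expiry heuristic can be expressed as small helpers. These are illustrative, not part of the package:

```python
# Illustrative helpers: cookie-file fallback and the expiry heuristic above.
import json
from pathlib import Path

def load_cookies(paths):
    """Return parsed cookies from the first existing file in priority order, else None."""
    for candidate in paths:
        path = Path(candidate)
        if path.is_file():
            return json.loads(path.read_text(encoding="utf-8"))
    return None

def cookies_look_expired(notes):
    """Per the README: a scrape returning <= 1 note signals an expired session."""
    return len(notes) <= 1
```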


Database Operations

from xhscosmoskill.archetype_registry import (
    save_blogger,     # write/update a blogger record
    get_blogger,      # look up by name
    list_bloggers,    # list all bloggers
    list_archetypes,  # view the current archetype library
    add_archetype,    # add a custom archetype
    update_archetype_signals  # iteratively update archetype signal words
)
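The registry is described elsewhere in this page as plain JSON files under data/. The exact save_blogger/get_blogger signatures are not shown in this README, so the sketch below is an assumption about how such a JSON "database" could work:

```python
# Illustrative JSON-file registry in the spirit of data/bloggers.json.
# Function names mirror the imports above, but the signatures are assumed.
import json
from pathlib import Path

def save_blogger_sketch(record, db_path):
    """Insert or update a blogger record keyed by its "name" field."""
    path = Path(db_path)
    db = json.loads(path.read_text(encoding="utf-8")) if path.is_file() else {}
    db[record["name"]] = record
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(db, ensure_ascii=False, indent=2), encoding="utf-8")

def get_blogger_sketch(name, db_path):
    """Look a blogger up by name; None if the DB or record is missing."""
    path = Path(db_path)
    if not path.is_file():
        return None
    return json.loads(path.read_text(encoding="utf-8")).get(name)
```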

Deliverables

File                               Format            Description
{博主名}-结构化总结报告.md/.docx   Markdown + Word   Single-account deep analysis (15 sections)
{博主名}-爆款选题公式.md/.docx     Markdown + Word   6 models + 30 topic directions
选题公式学习-综合版.md/.docx       Markdown + Word   Multi-account comparison

(Filenames are literal; {博主名} is the placeholder for the blogger's name.)

Evidence Grading

Level  Source                                                         Usage
A1     Publicly visible Xiaohongshu profile data                      State directly
A2     User-provided screenshots                                      State directly
B1     Third-party public sources (interviews/Newrank/encyclopedias)  Background supplement
C1     Synthesized inference                                          Explicitly labeled as "inference"
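A reporting pipeline can enforce this grading mechanically, e.g. by rejecting unknown grades and prefixing C1 claims. Illustrative only, not part of the package:

```python
# Illustrative enforcement of the evidence grades defined above.
EVIDENCE_GRADES = {
    "A1": "public Xiaohongshu profile data: state directly",
    "A2": "user-provided screenshots: state directly",
    "B1": "third-party public sources: background supplement only",
    "C1": "synthesized inference: must be labeled",
}

def format_claim(text, grade):
    """Render a claim according to its evidence grade."""
    if grade not in EVIDENCE_GRADES:
        raise ValueError(f"unknown evidence grade: {grade}")
    return f"[inference] {text}" if grade == "C1" else text
```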

Reference Resources

  • Case-study archive (hsword): openclaw_cosmo/afa/hsword/
    • 井越 (Type A): Absurdist Aesthetics full report + viral formulas
    • 橘一橙NiceFriend (Type B): Resonance Naming full report + viral formulas
    • 丑穷女孩陈浪浪 (Type C): Reality Strategy full report + viral formulas
    • 选题公式学习-综合版: dual system + hybrid formulas + 6 title formulas
  • Analysis framework reference: references/workflow.md
  • Report templates: references/templates.md
  • hsword frameworks: references/hsword-frameworks.md
  • Newrank data: https://www.newrank.cn/profile/xiaohongshu/{user_id}

Analyzed Blogger Archive

Blogger     Archetype  Top likes  Avg likes  Analyzed
xixiCharon  B+A        100k+      3,148      2026-04-23

Additional bloggers are written to data/bloggers.json automatically after /xhsfx invocations.


Integrated from xhscosmoskill v1.0 + xhsfenxi v2.1 + hsword case studies · version 2.0.0 · 2026-04-23


Purpose & Capability

Xhsfenxi Pro is a full-stack Xiaohongshu (小红书) blogger analysis skill. It combines automated data collection with a structured deep-analysis framework derived from real-world case studies (hsword archive).

Capability                       Description
Data Collection                  Scrape user notes via SeleniumBase XHR interception; no API key required
Three-Archetype Classification   Classify bloggers as Type A (Absurdist Aesthetics) / B (Resonance Naming) / C (Reality Strategy) or hybrid
Core Identity Analysis (hsword)  Three-layer persona deconstruction: surface labels / mid-layer traits / deep values
Viral Topic Formula              6-model formula system with reusable sentence templates and 30 ready-to-use topic directions
Structured Reports               Full 15-section analysis report in Markdown
Word Output                      Black Heiti-font Word documents via scripts/build_docx.py
Iterative Archetype DB           data/archetypes.json evolves as more bloggers are analyzed

Does NOT:

  • Require any Xiaohongshu API token (uses browser-based XHR interception)
  • Access private/protected notes or accounts
  • Store or transmit user credentials
  • Generate fake engagement data or fabricate analysis results

Instruction Scope

In scope — will handle:

  • "Analyze this Xiaohongshu blogger URL"
  • "Generate viral topic formula for this account"
  • "What archetype is this blogger?"
  • "Produce a Word report for this creator"
  • "Compare two bloggers"
  • Any /xhsfx command invocation

Out of scope — will not handle:

  • Accessing private accounts or bypassing platform security
  • Publishing or posting content to Xiaohongshu on behalf of users
  • Real-time follower/engagement data (uses public page data only)
  • Non-Xiaohongshu platforms

When cookies expire: The browser session cookie file expires approximately every 30 days. If notes returns ≤ 1 result, prompt the user to re-run python3 xhs_login.py to refresh cookies. The skill will not silently fail — it will detect and report the expired state.


Credentials

This skill uses no API tokens or platform credentials for analysis.

Action             Credential                                  Scope
Data collection    Xiaohongshu session cookie (browser-based)  Local file only (xhs_cookies.json)
Report generation  None                                        Local file writes only
Word output        None                                        Local file writes only

Cookie file locations (in priority order):

  1. shopify-marketing/xhs_cookies.json (preferred — most recent)
  2. xiaohongshu_new/xhs_cookies.json (fallback)

Does NOT hardcode tokens, API keys, or account credentials. Cookie files are local only and never transmitted.


Persistence & Privilege

Path                   Content                                               When written
data/archetypes.json   Archetype registry; evolves as bloggers are analyzed  On each save_blogger() call
data/bloggers.json     Analyzed blogger database                             On each save_blogger() call
/tmp/{creator}-*.md    Markdown analysis reports                             On report generation
/tmp/{creator}-*.docx  Word documents                                        On build_word() call

Does NOT write to:

  • Any system directories outside the skill directory and /tmp/
  • Shell configuration files (~/.zshrc etc.)
  • Xiaohongshu platform (read-only)
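The claimed write surface above can be audited with a simple path guard; ALLOWED_ROOTS mirrors the documented paths. Illustrative only, not part of the skill (requires Python 3.9+ for Path.is_relative_to):

```python
# Illustrative write-path guard reflecting the documented write surface.
from pathlib import Path

ALLOWED_ROOTS = (Path("data"), Path("/tmp"))

def write_is_allowed(target):
    """True if target falls under one of the documented write roots."""
    candidate = Path(target)
    return any(candidate.is_relative_to(root) for root in ALLOWED_ROOTS)
```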

Uninstall: Delete the skill directory. Cookie files at xhs_cookies.json paths can be deleted separately.


Install Mechanism

clawhub install xhsfenxi-pro

Prerequisites:

pip install seleniumbase python-docx

First-time cookie setup:

python3 xhs_login.py
# Opens browser → log in to Xiaohongshu → cookies saved automatically

Verify installation:

import sys
sys.path.insert(0, "/path/to/xhscosmoskill/..")
import xhscosmoskill
print(xhscosmoskill.__version__)  # should print: 2.0.0

from xhscosmoskill import print_cookie_status
print_cookie_status()  # ✅ 全部有效 / ❌ 已过期

Quick analysis:

from xhscosmoskill import XhsClient, analyze_account

with XhsClient() as xhs:
    notes = xhs.get_user_notes("USER_ID_HERE", limit=50)
    report = xhs.analyze_account(notes, creator_name="BloggerName")
    print(report)
