Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

运营数据日报虾 ("Operations Data Daily Report Shrimp")

v1.0.0

运营数据日报虾: automated multi-platform operations data collection and daily report generation. Collects data from Douyin, Xiaohongshu, WeChat Channels, Bilibili, Weibo, and other platforms, cleans and standardizes it, then generates a structured daily report and pushes it to Feishu docs/group chats. **Use this Skill when**: (1) you need to aggregate operations data across multiple content platforms (Douyin/Xiaohongshu/WeChat Channels/Bilibili/Weibo) (2) you need to generate daily, weekly, or monthly operations reports...

by Ricky@tujinsama

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for tujinsama/operation-daily-report-claw.

Prompt Preview: Install & Setup
Install the skill "运营数据日报虾" (tujinsama/operation-daily-report-claw) from ClawHub.
Skill page: https://clawhub.ai/tujinsama/operation-daily-report-claw
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install operation-daily-report-claw

ClawHub CLI


npx clawhub@latest install operation-daily-report-claw
Security Scan
Capability signals
Requires OAuth token
These labels describe what authority the skill may exercise. They are separate from suspicious or malicious moderation verdicts.
VirusTotal
Suspicious
View report →
OpenClaw
Suspicious
medium confidence
Purpose & Capability
The skill claims to aggregate platform metrics and push reports to Feishu, which matches the included scripts. However, the registry metadata lists no required environment variables or primary credential, while the scripts and reference docs clearly expect multiple sensitive credentials (DOUYIN_ACCESS_TOKEN, XIAOHONGSHU_COOKIE, WEIXIN_CORP_ID/SECRET, BILIBILI_SESSDATA, WEIBO_ACCESS_TOKEN, etc.). That omission is an inconsistency: a legitimate aggregator should declare these requirements up front.
Instruction Scope
SKILL.md and the scripts keep to the stated workflow: ask user which platforms, read credentials from a local .env, call platform APIs (curl), normalize JSON, run report generator, then push via Feishu tools. The instructions reference reading a .env and saving files under data/raw and data/reports (expected). They also instruct use of feishu_create_doc and message tools (agent/tool integration) and scheduling via cron. Nothing in the instructions reads unrelated system config or exfiltrates to unknown endpoints.
Install Mechanism
There is no install spec (lower install risk) but the skill bundle includes executable scripts that will run on the host. The fetch script uses curl/jq/python and the report generator requires pandas/jinja2; the package does not declare dependency installation. This is not malicious but means the operator must ensure required binaries and Python packages are installed in a controlled environment.
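The undeclared dependencies can be checked before first run. A minimal preflight sketch; the binary and module names come from the scan findings above, and the script itself is not part of the skill bundle:

import importlib.util
import shutil
import sys

# Binaries the fetch script shells out to, per the scan above.
REQUIRED_BINARIES = ["curl", "jq", "python3"]
# Python packages the report generator imports, per the scan above.
REQUIRED_MODULES = ["pandas", "jinja2"]

missing = [b for b in REQUIRED_BINARIES if shutil.which(b) is None]
missing += [m for m in REQUIRED_MODULES if importlib.util.find_spec(m) is None]
if missing:
    sys.exit("Missing dependencies: " + ", ".join(missing))
print("All expected dependencies are present.")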
Credentials
The skill requires multiple high-sensitivity credentials (OAuth tokens, cookies, SESSDATA, corp secrets) for the platforms it integrates with. The registry metadata declares no required env vars — a mismatch that hides the true secrets surface. Using cookies (e.g., XIAOHONGSHU_COOKIE or BILIBILI_SESSDATA) is inherently fragile and sensitive and should be minimized or rotated. The number and sensitivity of env vars is proportionate to the task, but they should be declared and justified in metadata.
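Since the metadata declares nothing, a cautious operator can enumerate the credentials the scripts actually read and fail fast if any are unset. A sketch, assuming the variable names listed above (WEIXIN_CORP_SECRET is expanded from the report's WEIXIN_CORP_ID/SECRET shorthand; verify the exact set against the scripts):

import os
import sys

# Credential names taken from the scan report; confirm against the
# skill's scripts before relying on this list.
EXPECTED_VARS = [
    "DOUYIN_ACCESS_TOKEN",
    "XIAOHONGSHU_COOKIE",
    "WEIXIN_CORP_ID",
    "WEIXIN_CORP_SECRET",
    "BILIBILI_SESSDATA",
    "WEIBO_ACCESS_TOKEN",
]

missing = [v for v in EXPECTED_VARS if not os.environ.get(v)]
if missing:
    sys.exit("Refusing to run; unset credentials: " + ", ".join(missing))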
Persistence & Privilege
The skill is user-invocable and not forced-always; it does not request elevated platform privileges or modify other skills. It reads/writes files under its own workspace (data/) and suggests optional cron scheduling, which is normal for automation. Autonomous invocation is allowed (platform default) but not combined with other red flags here.
What to consider before installing
This skill appears to implement the advertised report workflow, but the package metadata hides the real secrets it needs. Before installing:

  1. Insist the publisher update the metadata to list the required env vars (DOUYIN_*, XIAOHONGSHU_COOKIE, WEIXIN_CORP_*, BILIBILI_SESSDATA, WEIBO_*, etc.).
  2. Review the scripts locally (they call only known platform endpoints via curl) and run them in an isolated environment or container.
  3. Avoid pasting long-lived full-account credentials; prefer scoped tokens where possible and rotate cookies/tokens frequently.
  4. Confirm how the Feishu integration is performed (the skill expects external agent tools such as feishu_create_doc/message).
  5. If you must run on production servers, ensure data/ and .env are permission-restricted (see the sketch below) and consider limiting network egress or running behind a proxy to monitor outgoing API calls.

If the publisher cannot justify the missing metadata or explain how credentials are stored and rotated, treat the skill as untrusted.
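For point (5), a minimal permission-tightening sketch, assuming the skill is run from its workspace root where .env and data/ live:

import os
import stat

# Owner-only read/write on the credentials file (0o600).
os.chmod(".env", stat.S_IRUSR | stat.S_IWUSR)
# Owner-only access on the data workspace and everything inside it.
os.chmod("data", stat.S_IRWXU)
for root, dirs, files in os.walk("data"):
    for d in dirs:
        os.chmod(os.path.join(root, d), stat.S_IRWXU)
    for f in files:
        os.chmod(os.path.join(root, f), stat.S_IRUSR | stat.S_IWUSR)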

Like a lobster shell, security has layers — review code before you run it.

latest: vk977avymby980jjdyy9tn2jh0584gm69
65 downloads
0 stars
1 version
Updated 2w ago
v1.0.0
MIT-0

运营数据日报虾

Automated multi-platform operations data collection → cleaning and analysis → daily report generation → Feishu push.

Workflow

Step 1: Confirm configuration

Ask the user:

  • Which platforms to collect from (Douyin/Xiaohongshu/WeChat Channels/Bilibili/Weibo)
  • Whether account IDs and API keys are already configured (see references/platform-api-guide.md)
  • Report type: daily / weekly / monthly
  • Push target: a Feishu group chat_id or an individual

Step 2: Data collection

Collect data with scripts/fetch-platform-data.sh:

# Collect a single platform
./scripts/fetch-platform-data.sh fetch douyin <account_id>

# Collect all platforms (in parallel)
./scripts/fetch-platform-data.sh fetch all

Output is JSON, stored at data/raw/<platform>_<date>.json.
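If you want to post-process these files outside the bundled scripts, a minimal loading sketch; the ISO date format in the filename is an assumption based on the <platform>_<date>.json pattern:

import json
from datetime import date, timedelta

platform = "douyin"
# Assumes <date> is ISO-formatted, e.g. data/raw/douyin_2026-03-31.json.
day = (date.today() - timedelta(days=1)).isoformat()
with open(f"data/raw/{platform}_{day}.json") as f:
    raw = json.load(f)
print(f"Loaded {platform} metrics for {day}")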

Step 3: Generate the report

Generate the report with scripts/generate-report.py:

# Generate yesterday's daily report
python3 scripts/generate-report.py --date yesterday

# Generate for a specific date
python3 scripts/generate-report.py --date 2026-03-31

# Generate a weekly report
python3 scripts/generate-report.py --type weekly --date 2026-03-31

Step 4: Push to Feishu

  1. Create a Feishu cloud doc (Markdown format) with feishu_create_doc
  2. Send the doc link to the target group chat or individual with the message tool (a webhook fallback is sketched below)
  3. If there is anomalous data, additionally send an alert message @-mentioning everyone
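The steps above rely on agent-provided tools (feishu_create_doc and message). If those tools are unavailable, one common alternative is a Feishu custom-bot webhook. A minimal sketch, assuming you have created such a bot; the URL below is a placeholder:

import json
import urllib.request

# Placeholder webhook URL for a Feishu custom bot; substitute your own.
WEBHOOK_URL = "https://open.feishu.cn/open-apis/bot/v2/hook/<your-token>"

def push_text(text: str) -> None:
    # Feishu custom bots accept a JSON body with msg_type and content.
    body = json.dumps({"msg_type": "text", "content": {"text": text}})
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

push_text("Daily report ready: <doc link>")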

Step 5: Configure a scheduled task (optional)

If the user wants the report generated automatically every day, provide a cron example:

# Generate the daily report automatically at 8:00 every morning
0 8 * * * cd /path/to/workspace && ./scripts/fetch-platform-data.sh fetch all && python3 scripts/generate-report.py --date yesterday

Standard data fields

Data from all platforms is normalized to the following standard fields:

| Field | Description |
| --- | --- |
| platform | Platform name |
| date | Data date |
| views | Views / reads |
| likes | Likes |
| comments | Comments |
| shares | Shares / reposts |
| favorites | Favorites |
| new_followers | New followers that day |
| total_followers | Total followers |
| engagement_rate | Engagement rate ((likes + comments + shares) / views) |
| completion_rate | Completion rate (video platforms) |
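To illustrate the normalization step, a sketch that maps a raw platform payload onto the standard fields above and derives engagement_rate; the raw key names are hypothetical, only the target schema comes from the table:

# Hypothetical raw payload; real key names depend on each platform's API.
raw = {"play_count": 12000, "digg_count": 340, "comment_count": 56,
       "share_count": 21, "collect_count": 80}

record = {
    "platform": "douyin",
    "date": "2026-03-31",
    "views": raw["play_count"],
    "likes": raw["digg_count"],
    "comments": raw["comment_count"],
    "shares": raw["share_count"],
    "favorites": raw["collect_count"],
}
# engagement_rate = (likes + comments + shares) / views, per the table above.
record["engagement_rate"] = (
    (record["likes"] + record["comments"] + record["shares"]) / record["views"]
    if record["views"] else 0.0
)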

Anomaly detection

After collection, anomalies are checked automatically; the rules are detailed in references/anomaly-rules.md. Anomalies are flagged with ⚠️ in the report, and a separate alert message is sent.
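The real rules live in references/anomaly-rules.md; as a shape for what such a check can look like, a sketch that flags a sharp day-over-day drop (the 30% threshold and metric list are invented examples, not the skill's rules):

def flag_anomalies(today: dict, yesterday: dict, drop_threshold: float = 0.30) -> list:
    """Return warning strings for metrics that fell sharply day-over-day."""
    warnings = []
    for metric in ("views", "likes", "new_followers"):
        prev, cur = yesterday.get(metric, 0), today.get(metric, 0)
        if prev > 0 and (prev - cur) / prev > drop_threshold:
            pct = 100 * (prev - cur) / prev
            warnings.append(f"⚠️ {metric} dropped {pct:.0f}% vs yesterday")
    return warnings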

Report templates

Report formats are detailed in references/report-templates.md; supported variants:

  • Concise (core metrics only)
  • Detailed (with trend charts)
  • Comparison (side-by-side across platforms)
  • Anomaly (anomalous data only)
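The report generator depends on jinja2 (per the security scan), so a template along these lines is plausible; this one is illustrative only, and the real templates live in references/report-templates.md:

from jinja2 import Template

# Illustrative "concise" template; not the skill's actual template.
tpl = Template(
    "# {{ platform }} Daily Report ({{ date }})\n"
    "- Views: {{ views }}\n"
    "- Likes: {{ likes }}\n"
    "- Engagement rate: {{ '%.2f%%' % (engagement_rate * 100) }}\n"
)
print(tpl.render(platform="douyin", date="2026-03-31",
                 views=12000, likes=340, engagement_rate=0.035))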

Platform API integration

Authentication and invocation for each platform's API are detailed in references/platform-api-guide.md.

Notes

  • Xiaohongshu has no official API and requires Cookie auth, which is relatively unstable; check weekly whether the Cookie has expired
  • API calls are rate-limited; collecting 1-2 times per day is recommended
  • Ensure the server time zone matches the platforms' data time zone (Asia/Shanghai)
  • Historical data is stored in local SQLite; back it up regularly for long-term use (see the sketch below)
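The last note mentions local SQLite storage; a minimal sketch of persisting a normalized record (the path, table name, and schema are assumptions, not the skill's actual layout):

import sqlite3

conn = sqlite3.connect("data/history.db")  # assumed path under data/
conn.execute("""
    CREATE TABLE IF NOT EXISTS daily_metrics (
        platform TEXT, date TEXT, views INTEGER, likes INTEGER,
        comments INTEGER, shares INTEGER, favorites INTEGER,
        new_followers INTEGER, total_followers INTEGER,
        PRIMARY KEY (platform, date)
    )
""")
conn.execute(
    "INSERT OR REPLACE INTO daily_metrics VALUES (?,?,?,?,?,?,?,?,?)",
    ("douyin", "2026-03-31", 12000, 340, 56, 21, 80, 150, 48000),
)
conn.commit()
conn.close()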
