Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

OpenClaw行业情报官

v1.0.0

Industry Intelligence Officer - periodically collects trending content from GitHub Trending, X (Twitter), Zhihu, 36kr, Juejin, and other platforms, AI-summarizes it, and pushes the digest to configured channels. Integrates the fxtwitter API and RSSHub.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for muisenice/openclaw-intelligence-officer.

Prompt preview (Install & Setup):
Install the skill "OpenClaw行业情报官" (muisenice/openclaw-intelligence-officer) from ClawHub.
Skill page: https://clawhub.ai/muisenice/openclaw-intelligence-officer
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install openclaw-intelligence-officer

ClawHub CLI


npx clawhub@latest install openclaw-intelligence-officer
Security Scan
VirusTotal: Pending
OpenClaw: Suspicious (high confidence)
Purpose & Capability
The stated purpose (periodic collection from GitHub, X/fxtwitter, Zhihu, and RSS sources; AI summarization; pushing to webhooks/email) is consistent with the runtime instructions. However, the skill metadata claims no required binaries or environment variables, while SKILL.md clearly expects tools (curl, python3, xmllint, the openclaw CLI, fxtwitter) and many environment variables (webhooks, SMTP credentials, FXTWITTER_API_TOKEN). That mismatch is unexpected and suggests the metadata is incomplete.
Instruction Scope
SKILL.md instructs the agent to fetch external feeds, parse HTML/RSS, call the fxtwitter API, generate AI summaries, write cache/logs/summaries under memory/intelligence, and push to external endpoints (Feishu/DingTalk/Telegram/SMTP). These actions are consistent with the stated purpose, but they require network access, filesystem writes, and credentials. The instructions assume an 'openclaw' CLI and local directories; neither is declared in the metadata. The skill does not request unrelated secrets, but it does require sensitive push and SMTP tokens, and it will persist sent content locally.
Install Mechanism
This is an instruction-only skill with no install spec or code files, so nothing will be automatically downloaded or written by the skill itself. That reduces installation risk, but also increases the need for accurate metadata since the skill expects external binaries and an environment that may not exist.
Credentials
SKILL.md lists multiple sensitive environment variables (FXTWITTER_API_TOKEN, FEISHU_WEBHOOK, DINGTALK_WEBHOOK, TELEGRAM_BOT_TOKEN/CHAT_ID, SMTP credentials) which are reasonable given multi-channel push capability. However the registry metadata lists no required env vars—this discrepancy is a proportionality/visibility problem. Before installing, confirm only minimal, dedicated credentials are used (e.g., bot tokens with limited scope, a disposable SMTP account).
Persistence & Privilege
The skill does not request always:true and will not force inclusion in all runs. It instructs storing cache, logs, and summaries under memory/intelligence which is normal for this use case. Scheduling (openclaw cron add) is part of its design but assumes the platform CLI supports cron management.
What to consider before installing
This skill appears to do what it says (collect feeds, summarize, push), but SKILL.md and the registry metadata disagree. Before installing:

1. Confirm that the host has the required tools (curl, python3, xmllint, the 'openclaw' CLI or equivalent) and that those requirements are declared by the skill author.
2. Provide only dedicated, minimal-scope credentials (separate bot/webhook tokens and a disposable SMTP account); avoid using high-privilege or personal credentials.
3. Verify where data will be written (memory/intelligence/*) and ensure you are comfortable with local persistence of scraped content and logs.
4. Confirm the legal/terms-of-service implications of scraping each source (GitHub/X/Zhihu) and respect rate limits.
5. Ask the publisher to update the registry metadata to list the required binaries and env vars so you can make an informed install decision.

If you cannot verify these points, run the skill in a sandboxed environment or decline installation.
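The checks above can be automated as a small preflight script. This is a minimal sketch: the binary and env-var names are taken from SKILL.md as quoted in this scan, and the function name `preflight` is purely illustrative.

```python
import os
import shutil

# Tools and env vars SKILL.md expects but the registry metadata omits.
REQUIRED_BINARIES = ["curl", "python3"]
OPTIONAL_BINARIES = ["xmllint"]
PUSH_ENV_VARS = [
    "FEISHU_WEBHOOK", "DINGTALK_WEBHOOK",
    "TELEGRAM_BOT_TOKEN", "SMTP_HOST",
]

def preflight() -> list[str]:
    """Return a list of problems; an empty list means the host looks ready."""
    problems = [f"missing binary: {b}" for b in REQUIRED_BINARIES
                if shutil.which(b) is None]
    problems += [f"missing optional binary: {b}" for b in OPTIONAL_BINARIES
                 if shutil.which(b) is None]
    if not any(os.environ.get(v) for v in PUSH_ENV_VARS):
        problems.append("no push channel configured (set at least one webhook/token)")
    return problems

if __name__ == "__main__":
    for p in preflight():
        print("WARN:", p)
```

Running it before install surfaces the metadata gaps without touching the skill itself.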

Like a lobster shell, security has layers — review code before you run it.

Tags: intelligence · latest · monitoring · news · openclaw
97 downloads · 0 stars · 1 version · Updated 1mo ago
v1.0.0
MIT-0

OpenClaw Intelligence Officer v1.0

Industry Intelligence Officer - an automated industry-intelligence collection and push system

Features

  • 🤖 Automated collection: GitHub Trending, X (Twitter), Zhihu, 36kr, Juejin, and more
  • 📝 AI summaries: extracts key information and generates digests
  • 🔔 Multi-channel push: Feishu, DingTalk, Telegram, email
  • Scheduled tasks: configurable collection frequency

Data Sources

Priority P0

| Platform | Collection method | Suggested frequency |
| --- | --- | --- |
| GitHub Trending | RSS / API | 2× daily |
| X (Twitter) | fxtwitter API | 2× daily |
| Zhihu Hot List | RSSHub | 2× daily |

Priority P1

| Platform | Collection method | Suggested frequency |
| --- | --- | --- |
| 36kr | RSS | 1× daily |
| Juejin | RSS | 1× daily |
| Weibo Hot Search | RSSHub | 1× daily |
| WeChat Reading chart | RSSHub | 1× weekly |

Configuration

Environment variables

# Collection
FXTWITTER_API_TOKEN=your_token  # optional; the public endpoint is used if unset

# Push channels (configure at least one)
FEISHU_WEBHOOK=your_webhook
DINGTALK_WEBHOOK=your_webhook
TELEGRAM_BOT_TOKEN=your_token
TELEGRAM_CHAT_ID=your_chat_id
SMTP_HOST=smtp.example.com
SMTP_PORT=587
SMTP_USER=your_email
SMTP_PASS=your_password
TO_EMAIL=target@example.com
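Because FXTWITTER_API_TOKEN is optional, a collector has to branch on its presence. A minimal sketch, assuming the public fxtwitter endpoint at api.fxtwitter.com; the `Authorization: Bearer` header is an assumption, since fxtwitter's actual auth scheme is not documented here:

```python
import os
import urllib.request

def fx_request(path: str) -> urllib.request.Request:
    """Build a fxtwitter API request, falling back to the public
    endpoint when FXTWITTER_API_TOKEN is unset."""
    req = urllib.request.Request(f"https://api.fxtwitter.com/{path}")
    token = os.environ.get("FXTWITTER_API_TOKEN")
    if token:
        # Hypothetical header; verify against the fxtwitter docs.
        req.add_header("Authorization", f"Bearer {token}")
    return req
```

The same pattern (read env var, branch, build request) applies to each push channel.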

Source configuration

Configure in memory/intelligence/sources.yaml:

sources:
  - name: github-trending
    url: https://github.com/trending?since=weekly
    enabled: true
    priority: p0
    tags: [AI, open-source]

  - name: x-ai-news
    query: "AI OR LLM OR Agent"
    enabled: true
    priority: p0
    tags: [AI, LLM]

  - name: zhihu-hot
    url: https://www.zhihu.com/hot
    enabled: true
    priority: p0
    tags: [tech, trending]
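Once sources.yaml is parsed, the scheduler only needs the enabled entries for the tier being run. A minimal sketch with the entries inlined as dicts (the function name `due_sources` is illustrative):

```python
# Parsed form of sources.yaml entries (inlined here for illustration).
SOURCES = [
    {"name": "github-trending", "enabled": True, "priority": "p0"},
    {"name": "x-ai-news", "enabled": True, "priority": "p0"},
    {"name": "36kr", "enabled": True, "priority": "p1"},
    {"name": "weibo-hot", "enabled": False, "priority": "p1"},
]

def due_sources(priority: str) -> list[str]:
    """Names of the enabled sources in one priority tier, in file order."""
    return [s["name"] for s in SOURCES
            if s["enabled"] and s["priority"] == priority]
```

A P0 run would then fetch `due_sources("p0")` and skip anything disabled in the config.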

Collection Flow

Scheduled trigger (cron)
    │
    ▼
Load source list (by priority)
    │
    ▼
Fetch all platforms in parallel
    │
    ├── GitHub → parse the trending page
    ├── X → fxtwitter API
    └── RSS → parse XML
    │
    ▼
AI summary generation
    │
    ├── extract title, link, description
    ├── generate a one-line summary
    └── classify and tag
    │
    ▼
Format output
    │
    ├── Markdown
    └── HTML
    │
    ▼
Push to target channels
    │
    ├── Feishu webhook
    ├── DingTalk webhook
    └── Telegram bot
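The "fetch all platforms in parallel" step, combined with the failure-tolerance design principle (one failing source must not abort the run), can be sketched with Python's standard thread pool. `fetch` here is a placeholder for a real HTTP fetcher:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch(source: str) -> str:
    # Placeholder fetcher; a real one would HTTP-GET the source URL.
    if source == "broken-feed":
        raise RuntimeError("feed unreachable")
    return f"items from {source}"

def fetch_all(sources: list[str]) -> dict[str, str]:
    """Fetch sources in parallel; one failing source does not abort the rest."""
    results: dict[str, str] = {}
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(fetch, s): s for s in sources}
        for fut in as_completed(futures):
            name = futures[fut]
            try:
                results[name] = fut.result()
            except Exception as exc:
                results[name] = f"ERROR: {exc}"  # record and continue
    return results
```

Errors end up as per-source entries in the result, which maps directly onto the per-day log files under memory/intelligence/logs/.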

Output Format

Feishu/DingTalk card

## 📊 Industry Intelligence {date}

### 🔥 GitHub Trending
1. **[project name]** - ⭐1234
   description text...
   Tags: #AI #open-source

2. **[project name]** - ⭐987
   ...

### 🐦 X Highlights
1. @username: tweet content...
   link

### 📰 Zhihu Hot List
1. question title
   heat score and link
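Rendering such a card from structured items is plain string assembly. A minimal sketch for the GitHub Trending section; the item field names (`name`, `stars`, `desc`) are assumptions, not part of the skill's spec:

```python
def render_card(date: str, trending: list[dict]) -> str:
    """Render the GitHub Trending section of the Markdown card."""
    lines = [f"## 📊 Industry Intelligence {date}", "", "### 🔥 GitHub Trending"]
    for i, item in enumerate(trending, 1):
        lines.append(f"{i}. **[{item['name']}]** - ⭐{item['stars']}")
        lines.append(f"   {item['desc']}")
    return "\n".join(lines)
```

The HTML variant for email would follow the same loop with different templates.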

Directory Layout

memory/intelligence/
├── sources.yaml       # source configuration
├── cache/             # cache of already-pushed content (dedup)
│   └── sent.json
├── logs/              # collection logs
│   └── 2026-03-24.md
└── summaries/         # generated digests
    └── 2026-03-24.md
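cache/sent.json backs the 24-hour dedup window described in the design principles. The check-and-record logic might look like the following sketch, operating on the parsed contents of sent.json; keying on a SHA-256 of the URL is an assumption about the cache format:

```python
import hashlib

TTL = 24 * 3600  # the 24-hour dedup window

def already_sent(url: str, sent: dict, now: float, ttl: float = TTL) -> bool:
    """True (skip) if url was pushed within the last ttl seconds;
    otherwise record the push time in `sent` and return False (send it)."""
    key = hashlib.sha256(url.encode()).hexdigest()
    if key in sent and now - sent[key] < ttl:
        return True
    sent[key] = now
    return False
```

The caller would load `sent` from cache/sent.json before the run and write it back afterwards.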

Usage

Trigger a collection manually

# Collect and push to the default channels
openclaw intelligence fetch

# Collect only, no push (preview)
openclaw intelligence fetch --dry-run

# Collect a specific source
openclaw intelligence fetch --source github-trending

Schedule the cron job

# Collect daily at 9:00 and 15:00
openclaw cron add --schedule "0 9,15 * * *" --task intelligence

View history

# Today's digest
cat memory/intelligence/summaries/2026-03-24.md

# A specific date
openclaw intelligence history --date 2026-03-23

Dependencies

  • curl - HTTP requests
  • python3 - parsing and summarization
  • xmllint - RSS parsing (optional)
  • fxtwitter - Twitter/X data collection
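Since xmllint is optional, the RSS parsing step can fall back to Python's standard library. A minimal sketch with an inlined sample feed:

```python
import xml.etree.ElementTree as ET

RSS_SAMPLE = """<rss version="2.0"><channel>
  <item><title>Post A</title><link>https://example.com/a</link></item>
  <item><title>Post B</title><link>https://example.com/b</link></item>
</channel></rss>"""

def parse_rss(xml_text: str) -> list[dict]:
    """Extract title/link pairs from an RSS 2.0 feed (stdlib only)."""
    root = ET.fromstring(xml_text)
    return [{"title": i.findtext("title"), "link": i.findtext("link")}
            for i in root.iter("item")]
```

Atom feeds use different element names (`entry`, `link href=...`), so a real parser would branch on the root element.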

Extending

Add a new data source

Add to sources.yaml:

- name: my-source
  url: https://example.com/feed
  type: rss  # or html, api
  enabled: true
  priority: p1
  tags: [custom]

Add a new push channel

Implement the corresponding webhook send function. Both channels accept a JSON POST, so a shared helper keeps the stubs small:

import json
import urllib.request

def _post_json(url, payload):
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)

def send_to_dingtalk(webhook, message):
    # DingTalk robot webhooks accept a JSON "text" payload
    _post_json(webhook, {"msgtype": "text", "text": {"content": message}})

def send_to_telegram(token, chat_id, message):
    # Telegram Bot API sendMessage
    _post_json(f"https://api.telegram.org/bot{token}/sendMessage",
               {"chat_id": chat_id, "text": message})

Design Principles

  1. Dedup first: the same content is not pushed again within 24 hours
  2. Prioritization: P0 runs twice daily, P1 once daily
  3. Failure tolerance: one failing source does not affect the others
  4. Previewable: dry-run lets you inspect before sending
  5. Traceable: all history is kept locally

Related Projects
