科技新闻每日推送
Advisory. Audited by static analysis on Apr 30, 2026.
Overview
No suspicious patterns detected.
Findings (7)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
If run without editing, the agent can post to the embedded WeChat Work bot, and the committed bot credential can be abused by others.
An actual Enterprise WeChat robot key is committed in source instead of being user-supplied; metadata declares no primary credential or required environment variable, and the same pattern appears in other push scripts.
WEBHOOK_URL = "https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=66260502-9806-45ec-..."
Remove committed webhook keys, declare the webhook as a required credential, load it from an environment variable or protected config, and fail closed until the user supplies their own key.
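A minimal sketch of the fail-closed pattern, assuming a hypothetical environment variable name `WECHAT_WEBHOOK_URL` (the skill currently declares none):

```python
import os
import sys

def get_webhook_url() -> str:
    """Load the WeChat Work webhook from the environment; refuse to run without it."""
    url = os.environ.get("WECHAT_WEBHOOK_URL")  # hypothetical variable name
    if not url:
        # Fail closed: never fall back to a key committed in source.
        sys.exit("WECHAT_WEBHOOK_URL is not set; refusing to post. "
                 "Supply your own bot key.")
    return url
```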
The user could run a scraper that violates site expectations or gets their environment blocked, even though the output is just a news digest.
The skill explicitly advertises bypassing site anti-bot protections as part of the scraping workflow, which is higher-risk than ordinary RSS/API consumption.
- **Cloudflare bypass**: uses the cloudscraper library to bypass the site's anti-scraping protection
Prefer official RSS/API access, confirm the scraping is permitted, and avoid anti-bot bypass tooling unless the user knowingly accepts that risk.
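If an official feed exists, the digest can be built from it with the standard library alone and no anti-bot tooling. A hedged sketch (the RSS shape is an assumption, not taken from the skill; the XML would be fetched over an official feed endpoint with default TLS verification left on):

```python
import xml.etree.ElementTree as ET

def rss_titles(xml_text: str) -> list[str]:
    """Extract item titles from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [item.findtext("title", default="") for item in root.iter("item")]
```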
Tampered article content could be summarized and automatically pushed to the business chat as if it were legitimate news.
The script disables TLS certificate and hostname verification for network fetching, allowing a proxy or network attacker to alter fetched content.
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE
Keep default TLS verification enabled and only add certificate exceptions through explicit, user-reviewed configuration.
If the helper is run, other reachable machines could submit arbitrary data or learn about saved email files without authentication.
The backup email webhook binds to all interfaces and runs indefinitely; its POST handler saves email content, and its GET handler returns a directory/file listing.
with socketserver.TCPServer(("", PORT), EmailWebhookHandler) as httpd:
    ...
    httpd.serve_forever()
Bind only to localhost by default, require an authentication token, add request-size limits, and clearly document that this is an optional exposed server.
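One way the hardening could look, as a sketch: loopback-only bind, a shared token read from a hypothetical `WEBHOOK_TOKEN` environment variable, and a body-size cap (the port number and handler details are illustrative, not the skill's actual code):

```python
import http.server
import os
import socketserver

PORT = 8080          # illustrative; the skill's real port is not shown here
MAX_BODY = 1 << 20   # 1 MiB request-size limit

class EmailWebhookHandler(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        # Require a shared secret before accepting any data.
        token = self.headers.get("X-Auth-Token", "")
        if not token or token != os.environ.get("WEBHOOK_TOKEN", ""):
            self.send_error(403, "missing or invalid token")
            return
        length = int(self.headers.get("Content-Length", 0))
        if length > MAX_BODY:
            self.send_error(413, "request body too large")
            return
        body = self.rfile.read(length)
        # ... persist body as the original handler does ...
        self.send_response(204)
        self.end_headers()

def serve() -> None:
    # Loopback only; binding "" would expose the server to the whole network.
    with socketserver.TCPServer(("127.0.0.1", PORT), EmailWebhookHandler) as httpd:
        httpd.serve_forever()
```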
Users may over-trust generated summaries as source-grounded even when a script can fall back to generic templated content.
A documented manual/backup push path contains prewritten keyword templates, while the skill documentation repeatedly claims summaries are strictly based on the English source and do not invent information.
# Article template library - generates concrete content by keyword matching
ARTICLE_TEMPLATES = { ... 'default': { 'title': '电信行业最新动态和技术分析' ... }
Remove or clearly label template-based fallback summaries, require citations/source checks, and ensure all push paths follow the same source-grounding rules.
Future dependency changes could alter scraping behavior or introduce vulnerabilities.
The dependency is installed from the package ecosystem with only a lower-bound version, without a lockfile or hash pin.
cloudscraper>=1.2.71
Pin exact dependency versions, use a lockfile or hashes, and review cloudscraper before installation.
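A sketch of what the pinned requirement could look like; the version shown is just the current lower bound, and the hash is a placeholder to be generated with a tool such as `pip-compile --generate-hashes`:

```text
# requirements.txt: exact pin plus artifact hash
cloudscraper==1.2.71 \
    --hash=sha256:<hash from pip-compile --generate-hashes>
```

Installing with `pip install --require-hashes -r requirements.txt` then fails closed if the downloaded artifact ever changes.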
The skill may post more often than a user expects if they copy the README cron command.
The skill includes recurring autonomous scheduling instructions, and this README schedule differs from the daily-at-11:00 schedule described elsewhere.
openclaw cron add --name "科技新闻每日推送" \
  --schedule "every 3h"
Choose and document one clear schedule, and ensure users explicitly approve any recurring posting job.
