Skill flagged — suspicious patterns detected
ClawHub Security flagged this skill as suspicious. Review the scan results before using.
qqbot-daily-news-briefing
v1.0.0 · Generates and delivers automated daily tech and finance news briefings with AI commentary via QQ, Telegram, or Discord, using the Baidu API or DuckDuckGo search.
⭐ 0 · 34 · 0 current · 0 all-time
by @propn
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw
Suspicious
high confidence
Purpose & Capability
The skill's stated purpose (news aggregation + delivery) matches the scripts' behavior, but required capabilities are not declared. The code expects a Baidu search helper at {WORKSPACE}/skills/baidu-search/scripts/search.py (invoked via subprocess) even though the package/README/SKILL metadata do not declare this dependency. Scripts also assume OpenClaw CLI availability for delivery. The registry metadata lists no required env vars or credentials, yet the runtime expects BAIDU_API_KEY and target-user/channel settings. These mismatches mean the skill will fail or behave unexpectedly unless external dependencies and credentials are provided.
Instruction Scope
SKILL.md instructs the user to add environment variables (BAIDU_API_KEY, NEWS_TARGET_USER / QQ_TARGET_USER), set up cron jobs, and edit scripts, which is reasonable, but it does not document the external baidu-search script path referenced at runtime. The instructions also direct writing to and reading from system-wide locations (/etc/profile.d, /var/log, /root/.openclaw/workspace) and creating cron jobs; these have system-wide effects and require appropriate privileges. There is no instruction to install or verify the external 'baidu-search' skill or the OpenClaw CLI; the code will invoke both without checking for their presence.
Install Mechanism
There is no install spec (instruction-only install) — the lowest install risk — and all code is included in the bundle. That reduces risk from arbitrary downloads. However, the scripts expect external artifacts (OpenClaw CLI and a separate baidu-search script under WORKSPACE/skills) which are not provided; this is an operational/consistency gap rather than a network-install risk.
Credentials
Registry metadata claims no required env vars or primary credential, but SKILL.md and the code clearly use BAIDU_API_KEY and target-user variables (NEWS_TARGET_USER / QQ_TARGET_USER). The skill also writes logs to /var/log and stores files under /root/.openclaw/workspace, implying elevated privileges. Asking users to place API keys in system-wide /etc/profile.d or /etc/environment is more privileged than necessary and should be optional and explicit. The scripts also contain hardcoded sample target IDs, which should be removed or clearly documented.
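A less privileged pattern is to read the key from the environment at runtime and fail with guidance pointing at per-user configuration. A minimal sketch: the variable names come from the scan, the error text is illustrative.

```python
import os


def require_env(name: str) -> str:
    """Fetch a required setting from the environment without touching system files."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(
            f"{name} is not set. Export it in a per-user location such as "
            "~/.profile rather than /etc/profile.d or /etc/environment."
        )
    return value


# Variables the scan found the runtime actually expects:
# api_key = require_env("BAIDU_API_KEY")
# target = require_env("NEWS_TARGET_USER")
```

This keeps the credential scoped to the account running the skill instead of every user on the machine.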
Persistence & Privilege
always:false (normal). The skill's instructions encourage persistent scheduling via system cron and the delivery script (and news-deliver-direct.py) may schedule OpenClaw cron sessions automatically. That creates persistent scheduled behavior, which is consistent with its purpose. Still, because the scripts write to system paths (/var/log, /root workspace) and may schedule tasks, you should review and run them in a controlled environment (non-root or container) before deployment.
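One way to audit what got scheduled is to scan the output of `crontab -l` for entries that reference the skill. A sketch; the keyword list is an assumption about what the job lines would contain.

```python
def find_scheduled_jobs(crontab_text, keywords=("news", "openclaw")):
    """Return non-comment crontab lines that mention any of the given keywords."""
    jobs = []
    for line in crontab_text.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # skip blanks and comments
        if any(k in stripped.lower() for k in keywords):
            jobs.append(stripped)
    return jobs
```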
What to consider before installing
This skill is plausible for generating/delivering news, but several red flags need attention before installing:
- Missing declared dependencies: The generator calls a separate Baidu helper at {WORKSPACE}/skills/baidu-search/scripts/search.py which is not included or mentioned as a required package; ask the author where that comes from or provide it yourself.
- Undeclared environment variables: The registry lists none, but the code and SKILL.md require BAIDU_API_KEY and target-user envs. Treat any API key you set as sensitive.
- Privileged file locations: Scripts default to /root/.openclaw/workspace and /var/log; they will likely fail or require root. Prefer running in a dedicated non-root user or container and update WORKSPACE and log paths accordingly.
- Hardcoded sample target IDs are present in scripts; replace them with your own values before use and verify the target format.
- Persistent scheduling: The skill instructs cron setup and may add OpenClaw cron sessions; verify scheduled jobs after installation and ensure you want the automatic daily deliveries.
- OpenClaw CLI dependency: Delivery methods rely on the openclaw command. Confirm it is installed and configured and that the channels (qqbot/telegram/discord) are authorized.
Recommended next steps before using:
1) Request or locate the missing 'baidu-search' helper and update the README/SKILL metadata to declare it as a dependency.
2) Run the scripts in a sandboxed, non-root environment, updating WORKSPACE and log paths to user-owned directories.
3) Remove or replace the hardcoded target IDs.
4) Only export BAIDU_API_KEY if you trust the code, and prefer per-user (not system-wide) env configuration.
5) If you cannot confirm the origin of the baidu-search helper or the author, treat the skill as untrusted and avoid giving it production credentials.
Like a lobster shell, security has layers: review code before you run it.
latest: vk97fr1kdbrzq680pjp3y08wmks83rygh
