Skill v1.0.0

ClawScan security

Tech Weekly Briefing · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Benign · Mar 9, 2026, 4:08 PM
Verdict
benign
Confidence
high
Model
gpt-5-mini
Summary
The skill's code, instructions, and required files align with its stated purpose (aggregating RSS from listed tech sites and producing weekly briefs); it does not request secrets or surprising privileges, though it will create persistent cron jobs and runs local subprocesses (curl, blogwatcher) which you should review before enabling.
Guidance
This skill appears coherent with its stated purpose, but review a few points before enabling:

1. Inspect generate-briefing.py and setup-sources.py locally; they call subprocesses (curl and a 'blogwatcher' CLI) that will execute on your machine.
2. The SKILL.md suggests adding cron jobs; only add them if you want recurring network fetches and file writes in ~/.openclaw and /tmp.
3. setup-sources.py will run 'blogwatcher add' for many feeds; run it only if you trust what 'blogwatcher' does.
4. weekly-cron.sh contains a comment about sending to Telegram but currently only cats the report; if you plan to enable automated distribution, confirm which messaging tool will be used and that it won't post sensitive local data.
5. Run the scripts manually first (dry run) and verify outputs and network targets before installing cron.

If you want, I can highlight the exact lines in the scripts that call external binaries and the files to check.
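A quick way to carry out the local-inspection step is a pattern search for external-binary calls. The snippet below is a sketch: it greps a stand-in file created purely for demonstration; point the same grep at the skill's actual generate-briefing.py and setup-sources.py instead.

```shell
# Create a tiny stand-in script for demonstration only; in practice,
# run the grep against the skill's real .py files.
cat > /tmp/sample-skill-script.py <<'EOF'
import subprocess
subprocess.run(["curl", "-s", "https://example.com/feed.xml"])
subprocess.run(["blogwatcher", "add", "https://example.com/feed.xml"])
EOF

# List every line that invokes a subprocess or a named external binary.
grep -n -E 'subprocess|curl|blogwatcher' /tmp/sample-skill-script.py
```

Each matching line number tells you exactly where to read the surrounding code before trusting the script.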

Review Dimensions

Purpose & Capability
ok · Name/description (weekly tech briefings) match the included scripts and data: generate-briefing.py fetches RSS, filters/deduplicates, categorizes by company, and writes reports; setup-sources.py registers feeds with a local 'blogwatcher' CLI. No unrelated credentials, cloud APIs, or surprising binaries are requested.
Instruction Scope
note · SKILL.md instructs adding cron entries and running the Python scripts from the skill directory (persisting daily and weekly runs). Scripts operate only on the skill workspace (~/.openclaw/.../data) and /tmp for reports. They execute subprocesses (curl for one feed, and blogwatcher in setup), which is expected for an aggregator but worth reviewing because subprocess calls run arbitrary local binaries.
Install Mechanism
ok · There is no remote install/downloader in the registry metadata (instruction-only with included code files). The code does not download or extract arbitrary archives. It does rely on system binaries (curl, python3, blogwatcher) that are not installed by the skill.
Credentials
ok · The skill requests no environment variables or credentials. Network access is required to fetch RSS feeds, which is appropriate for the purpose. The setup script interacts with a 'blogwatcher' CLI that may modify local blogwatcher state; this is proportional but should be run only if you trust that CLI.
Persistence & Privilege
note · The skill asks the user to add cron jobs (a manual step) for daily/weekly runs; always:false is set and there are no autonomous model-invocation flags beyond the platform defaults. Cron entries create persistent background activity (periodic network fetches and file writes), which is expected for this use case but is a persistence decision the user must opt into.
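For orientation, cron persistence of this shape would look like the entries below. The schedule times and the $SKILL_DIR path are illustrative assumptions only; the actual suggested lines live in SKILL.md and should be read there before anything is installed.

```shell
# Hypothetical crontab entries (added via `crontab -e`).
# SKILL_DIR and the times are assumptions, not the SKILL.md originals.
SKILL_DIR=$HOME/skills/tech-weekly-briefing

# Daily feed fetch at 07:00 (network access, writes under the workspace):
# 0 7 * * * python3 $SKILL_DIR/generate-briefing.py

# Weekly report on Mondays at 08:00 (writes the report to /tmp):
# 0 8 * * 1 $SKILL_DIR/weekly-cron.sh
```

Removing persistence later is equally explicit: delete the two lines with `crontab -e` and the background activity stops.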