Skill v1.4.0

ClawScan security

Lead Researcher · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Benign · Apr 25, 2026, 5:08 PM
Verdict
benign
Confidence
high
Model
gpt-5-mini
Summary
The skill's code and instructions are consistent with its stated purpose (passive HTTP-based company enrichment) and it does not request unrelated credentials or installation steps.
Guidance
This skill appears to do what it says: passive HTTP-based enrichment of company domains and optional querying of a third-party news API (Tavily). Before installing or running it:

1) Be aware it will make outbound HTTP(S) requests to whatever domains you provide, including optional third-party API calls if you set TAVILY_API_KEY.
2) Review the code sections not shown here to confirm there are no hidden telemetry endpoints or unintended network calls; the UA string references edgeiq.dev, which will appear in target server logs.
3) The README/SKILL.md asks you to respect robots.txt and scraping terms of service, but the script does not appear to enforce robots.txt automatically; only run lookups you are authorized to perform.
4) If you plan to provide real contact lists or run bulk lookups, run the script in an isolated environment or with rate limiting to avoid accidental abuse.
5) The package owner is unknown; if you require trust guarantees, obtain the code from a verified source or run a security review before use.
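Points 3 and 4 above can be made concrete: a caller could gate each fetch on a robots.txt check and sleep between requests before running bulk lookups. A minimal sketch using only the standard library; the function names, UA string, and overall structure are illustrative and not taken from the skill's script:

```python
import time
import urllib.robotparser

def allowed_by_robots(url, robots_lines, user_agent="edgeiq-enrich/1.0"):
    """Check a fetched robots.txt body before requesting `url`.

    `robots_lines` is the robots.txt content split into lines; the caller
    is expected to have fetched https://<host>/robots.txt itself.
    """
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_lines)
    return rp.can_fetch(user_agent, url)

def polite_fetch(urls, fetch, robots_lines, delay_s=1.0):
    """Rate-limited loop: skip disallowed URLs, sleep between requests."""
    for url in urls:
        if not allowed_by_robots(url, robots_lines):
            continue  # conservative: honor Disallow rules
        fetch(url)
        time.sleep(delay_s)  # crude rate limit for bulk lookups
```

A wrapper like this keeps the enrichment logic unchanged while addressing both the missing robots.txt enforcement and the accidental-abuse risk noted above.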

Review Dimensions

Purpose & Capability
ok
Name/description (lead enrichment, tech fingerprinting, news, contacts) align with the included code and README. The module uses only standard libraries and makes HTTP requests to target domains and optional third-party APIs (Tavily), which is appropriate for this functionality.
Instruction Scope
note
SKILL.md instructs running the Python script and optionally providing a TAVILY_API_KEY; it explicitly recommends respecting robots.txt and scraping ToS. The code performs passive HTTP requests to target domains and looks up common paths and headers. I did not find any instructions or code that read unrelated local files, secrets, or system configuration. Note: the documentation recommends respecting robots.txt, but the visible code does not explicitly enforce robots.txt checks; the script will perform HTTP GETs to pages and paths unless additional checks are present in truncated sections.
Install Mechanism
ok
This is an instruction-only skill with no install spec and a single Python script relying only on the standard library. No external downloads, package installs, or unusual install locations are present.
Credentials
ok
No required environment variables or credentials are declared. An optional TAVILY_API_KEY may be provided for improved news results; that is reasonable and documented. There are no other secret-like env vars or config paths requested.
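The optional-key pattern described here can be sketched as follows; the helper name is illustrative, and only the single documented variable is read:

```python
import os
from typing import Optional

def tavily_key() -> Optional[str]:
    """Return the optional TAVILY_API_KEY, or None to skip the news lookup."""
    key = os.environ.get("TAVILY_API_KEY", "").strip()
    return key or None
```

Reading a single, documented, optional variable like this (and degrading gracefully when it is absent) is the low-risk credential posture the scanner is describing.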
Persistence & Privilege
ok
The skill is not marked always:true and does not request persistent system-wide privileges. It runs as a normal user-mode script and writes outputs only when asked (e.g., CSV export).