Skill v1.0.0

ClawScan security

Neckr0ik Etl Builder · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Review · Mar 6, 2026, 11:52 PM
Verdict: Review
Confidence: medium
Model: gpt-5-mini
Summary
The skill's README/CLI claims and examples do not match the included implementation and documentation, suggesting either incomplete packaging or misdescribed behavior; proceed with caution and verify the code before use.
Guidance
This package is inconsistent: the README promises many connectors and a custom CLI, but the shipped code appears to implement only local files, REST APIs, and SQLite, and it exposes a Python script rather than the advertised CLI. Before installing or running it:
(1) obtain and review the full pipeline.py (the provided file appears truncated) and any missing referenced docs to confirm which connectors are actually implemented;
(2) search the code for hardcoded remote endpoints or unexpected network calls (load steps may send data to webhooks or external services);
(3) run it in a sandbox or isolated environment;
(4) if you need cloud connectors (Postgres, S3, GCS, Google Sheets, etc.), prefer a well-documented, maintained tool or vendor-provided connector;
(5) avoid providing sensitive credentials until you have verified the code paths that use them.
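Step (2) above can be sketched as a small script. This is a quick heuristic, not the review's own tooling: the regex patterns and the pipeline.py path are assumptions, so extend the pattern list for anything else you consider suspicious.

```python
import pathlib
import re

# Patterns that often indicate outbound network activity in Python code.
# This list is illustrative, not exhaustive.
SUSPICIOUS = re.compile(
    r"https?://|requests\.(get|post|put|delete)|urlopen|urllib|socket\.|webhook",
    re.IGNORECASE,
)

def find_network_refs(path):
    """Return (line_number, line) pairs that mention URLs or HTTP-call idioms."""
    text = pathlib.Path(path).read_text(encoding="utf-8", errors="replace")
    return [
        (number, line.strip())
        for number, line in enumerate(text.splitlines(), start=1)
        if SUSPICIOUS.search(line)
    ]

# Example: audit the shipped script before running it.
# for number, line in find_network_refs("scripts/pipeline.py"):
#     print(f"{number}: {line}")
```

A hit is not proof of exfiltration, but every match should correspond to an endpoint you configured yourself.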

Review Dimensions

Purpose & Capability
concern: The SKILL.md advertises many connectors (Postgres, MySQL, MongoDB, S3, GCS, Google Sheets, Airtable, Notion, etc.) and a 'neckr0ik-etl-builder' CLI, but the included scripts/pipeline.py (partial view) only implements CSV, JSON, REST API, and SQLite extraction paths and exposes a Python entrypoint. There is no provided 'neckr0ik-etl-builder' binary/wrapper, and several referenced docs (references/connectors.md, references/transforms.md) are missing. This is an inconsistency between claimed capabilities and actual code.
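One way to check the connector mismatch described above is to enumerate what the shipped script actually defines and compare that list against the README's claims. A sketch, assuming a local copy of pipeline.py and that connectors are plain top-level functions or classes; note that ast.parse will raise a SyntaxError on a truncated file, which is itself a useful signal here.

```python
import ast
import pathlib

def list_definitions(path):
    """Return the function and class names defined in a Python file, so
    they can be compared against the connectors the README advertises."""
    tree = ast.parse(pathlib.Path(path).read_text(encoding="utf-8"))
    return sorted(
        node.name
        for node in ast.walk(tree)
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef))
    )

# Example: any advertised connector (postgres, s3, gcs, ...) with no matching
# definition in the output is a claim the code does not back up.
# print(list_definitions("scripts/pipeline.py"))
```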
Instruction Scope
concern: SKILL.md instructs the agent/user to run commands, supply credentials (examples use $API_TOKEN), and connect to many cloud services, but these instructions do not align with what the script implements. The docs are broad and open-ended (e.g., OAuth/service-account flows, many connectors) while the code appears limited. The instructions also reference files and docs that are not present in the package.
Install Mechanism
ok: There is no install spec (the skill is instruction-only plus a script), which is low risk from an installer perspective. The included pipeline.py is a plain Python script with no archive downloads or external install URLs.
Credentials
note: The skill declares no required environment variables or primary credential, yet SKILL.md example commands reference environment variables (e.g., $API_TOKEN) and describe OAuth/service-account usage. The code accepts tokens passed in pipeline configs, but the package does not proactively request or document the env vars it needs. Also, the script creates ~/.data-pipeline and writes pipeline configs and logs there (expected for an ETL tool).
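Because tokens are accepted through pipeline configs, they can end up persisted in the JSON files the tool writes, so it is worth auditing that directory after a test run. A minimal sketch, assuming the ~/.data-pipeline location and JSON config files; the list of credential-like key names is an illustrative assumption.

```python
import json
import pathlib

def find_stored_secrets(base=None):
    """Scan JSON files under ~/.data-pipeline for keys that look credential-like
    and have a non-empty value; returns (file, dotted key path) pairs."""
    base = pathlib.Path(base) if base else pathlib.Path.home() / ".data-pipeline"
    suspicious_keys = {"token", "api_token", "password", "secret", "api_key"}
    findings = []

    def walk(obj, path, prefix=""):
        if isinstance(obj, dict):
            for key, value in obj.items():
                if key.lower() in suspicious_keys and value:
                    findings.append((str(path), f"{prefix}{key}"))
                walk(value, path, f"{prefix}{key}.")
        elif isinstance(obj, list):
            for item in obj:
                walk(item, path, prefix)

    for path in base.rglob("*.json"):
        try:
            data = json.loads(path.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, OSError):
            continue  # unreadable or non-JSON content; skip
        walk(data, path)
    return findings
```

Any finding means a credential is sitting on disk in plaintext and should be rotated once you are done testing.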
Persistence & Privilege
ok: always: false, and the skill does not request elevated system privileges. The script persistently writes pipeline JSON and run logs under the user's home directory (~/.data-pipeline), which is reasonable for a pipeline tool but worth noting. It does not appear to modify other skills or global agent settings.