Skill flagged — suspicious patterns detected
ClawHub Security flagged this skill as suspicious. Review the scan results before using.
Neckr0ik ETL Builder
v1.0.0
Build data pipelines for ETL (Extract, Transform, Load). Connect databases, APIs, files, and cloud storage. Transform and sync data automatically. Use when y...
⭐ 0 · 243 · 1 current · 1 all-time
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw
Suspicious
medium confidence

Purpose & Capability
The SKILL.md advertises many connectors (Postgres, MySQL, MongoDB, S3, GCS, Google Sheets, Airtable, Notion, etc.) and a 'neckr0ik-etl-builder' CLI, but the included scripts/pipeline.py (partial view) implements only CSV, JSON, REST API, and SQLite extraction paths and exposes a Python entrypoint. No 'neckr0ik-etl-builder' binary or wrapper is provided, and several referenced docs (references/connectors.md, references/transforms.md) are missing. This is an inconsistency between claimed capabilities and actual code.
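One way to check this mismatch yourself is a static scan of the shipped script for extractor definitions. The sketch below is a rough heuristic, not proof: the connector keyword list and the sample source text are assumptions, and in practice you would feed it the full text of scripts/pipeline.py.

```python
import re

# Connector names the README claims; adjust to the actual SKILL.md list.
CLAIMED = ["postgres", "mysql", "mongodb", "s3", "gcs", "sheets",
           "airtable", "notion", "csv", "json", "sqlite", "api"]

def implemented_connectors(source: str) -> set[str]:
    """Return claimed connector names that appear in def/class names of the
    source text (a rough heuristic, not proof of a working connector)."""
    defs = re.findall(r"^\s*(?:def|class)\s+(\w+)", source, re.MULTILINE)
    names = " ".join(defs).lower()
    return {c for c in CLAIMED if c in names}

# Example against a stand-in snippet; run it on scripts/pipeline.py instead.
sample = "def extract_csv(path): ...\nclass SqliteExtractor: ..."
missing = sorted(set(CLAIMED) - implemented_connectors(sample))
print("claimed but not found:", missing)
```

A large "claimed but not found" list, as here, is exactly the README/code gap the scan describes.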
Instruction Scope
SKILL.md instructs the agent/user to run commands, supply credentials (examples use $API_TOKEN), and connect to many cloud services, but these instructions do not match what the script implements. The docs are broad and open-ended (e.g., OAuth/service-account flows, many connectors) while the code appears limited, and the instructions reference files and docs that are not present in the package.
Install Mechanism
There is no install spec (the skill is instruction-only plus a script), which is low risk from an installer perspective. The included pipeline.py is a plain Python script with no archive downloads or external install URLs.
Credentials
The skill declares no required environment variables or primary credential, yet SKILL.md example commands reference environment variables (e.g., $API_TOKEN) and describe OAuth/service-account usage. The code accepts tokens passed in pipeline configs, but the package does not proactively request or document the env vars it needs. The script also creates ~/.data-pipeline and writes pipeline configs and logs there (expected for an ETL tool).
Persistence & Privilege
The skill sets always:false and does not request elevated system privileges. The script persistently writes pipeline JSON and run logs under the user's home directory (~/.data-pipeline), which is reasonable for a pipeline tool but worth noting. It does not appear to modify other skills or global agent settings.
What to consider before installing
This package is inconsistent: the README promises many connectors and a custom CLI, but the shipped code appears to implement only local files, REST APIs, and SQLite, and exposes a Python script rather than the advertised CLI. Before installing or running it:

1. Obtain and review the full pipeline.py (the provided file appears truncated) and the missing referenced docs to confirm which connectors are implemented.
2. Search the code for hardcoded remote endpoints or unexpected network calls (loads may send data to webhooks or external services).
3. Run it in a sandbox or isolated environment.
4. If you need cloud connectors (Postgres, S3, GCS, Google Sheets, etc.), prefer a well-documented, maintained tool or vendor-provided connector.
5. Avoid providing sensitive credentials until you verify the code paths that use them.

Like a lobster shell, security has layers: review code before you run it.
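Step (2), searching for hardcoded endpoints, can be sketched as a quick pattern scan. The patterns below are assumptions about what outbound network activity typically looks like in Python code; a match is a prompt for manual review, not a verdict, and you would run this over the text of scripts/pipeline.py rather than the stand-in snippet.

```python
import re

# Patterns that commonly indicate outbound network activity in Python code.
SUSPECT = re.compile(
    r"https?://[^\s\"']+"              # hardcoded URLs
    r"|requests\.(?:get|post|put)\("   # requests-library calls
    r"|urllib\.request"                # stdlib HTTP
    r"|socket\.socket",                # raw sockets
)

def scan(source: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs worth a manual look."""
    return [(i, line.strip())
            for i, line in enumerate(source.splitlines(), 1)
            if SUSPECT.search(line)]

# Example on a stand-in snippet; feed it the text of scripts/pipeline.py.
snippet = "rows = load(cfg)\nrequests.post('https://example.com/hook', json=rows)"
for lineno, line in scan(snippet):
    print(lineno, line)
```

A hit like the webhook POST above is the "loads may send data to external services" case the scan warns about.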
latest: vk973r458491k68fembebr906wx82ch2w
