Data Pipeline Toolkit
Status: Pass. Audited by VirusTotal on May 5, 2026.
Overview
Type: OpenClaw Skill
Name: data-pipeline-toolkit-v2
Version: 1.0.0

The bundle consists only of metadata and documentation (SKILL.md) describing an ETL data pipeline tool. No executable code or scripts (such as the referenced pipeline.sh) are included in the provided files. While the documentation contains an affiliate referral link (referer.shadowai.xyz), there is no evidence of malicious intent, prompt injection, or unauthorized data access.
Findings (0)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Finding: Running the documented examples may invoke an unreviewed local script or install a differently named package.
Evidence: The evaluated registry slug is data-pipeline-toolkit-v2, while the docs reference data-pipeline-toolkit, and the provided manifest contains no pipeline.sh helper. This is a packaging/provenance ambiguity rather than proof of malicious behavior.
Snippet: clawhub install data-pipeline-toolkit ... ./pipeline.sh create my-pipeline
Recommendation: Verify the intended ClawHub package, its source, and the contents of any pipeline.sh executable before running ETL commands.
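Part of that verification can be automated before anything is executed. The sketch below assumes a POSIX shell with sha256sum and GNU grep available; the check_skill_script helper and the flagged patterns are illustrative, not part of the skill itself.

```shell
# check_skill_script: minimal pre-flight inspection of a skill's helper
# script before executing it. A sketch, not a complete audit.
check_skill_script() {
    script="$1"
    if [ ! -f "$script" ]; then
        echo "missing: $script -- do not run the documented examples" >&2
        return 1
    fi
    # record a checksum so a later run can detect silent replacement
    sha256sum "$script"
    # surface common download-and-execute patterns for manual review
    grep -nE 'curl|wget|base64|eval' "$script" \
        || echo "no obvious network or eval calls found"
}

# Example: check_skill_script ./data-pipeline-toolkit/pipeline.sh
```

Refusing to run when the helper is absent directly addresses the provenance gap above: the reviewed manifest contains no pipeline.sh, so any file found locally arrived by some other path.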
Finding: Over-privileged database or warehouse credentials could let a pipeline read or write more data than intended.
Evidence: The examples use database connection strings and can load data into external systems. That is expected for an ETL skill, but the metadata declares no required credentials or scoping guidance.
Snippet: ./pipeline.sh load my-pipeline postgres --connection $DATABASE_URL ... ./pipeline.sh load user-logs clickhouse --connection $CH_URL
Recommendation: Use dedicated least-privilege credentials, test on non-production data first, and confirm exactly which sources and destinations are used.
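A lightweight guard can enforce part of that recommendation before any load runs. In this sketch, assert_scoped_url and its "superuser-looking" and "prod" naming rules are assumptions; adapt the patterns to your own credential and host conventions.

```shell
# Refuse to proceed when a connection URL looks like a superuser credential
# or points at a production host. Pattern-based heuristic only.
assert_scoped_url() {
    url="$1"
    case "$url" in
        *://postgres:*@*|*://postgres@*|*://root:*@*|*://root@*|*://admin*@*)
            echo "refusing: superuser-style credential in connection URL" >&2
            return 1 ;;
        *prod*)
            echo "refusing: URL appears to reference a production host" >&2
            return 1 ;;
    esac
    echo "ok: credential looks scoped"
}

# Example:
#   assert_scoped_url "$DATABASE_URL" &&
#       ./pipeline.sh load my-pipeline postgres --connection "$DATABASE_URL"
```

A name-based check cannot replace real role scoping in the database, but it catches the common mistake of pasting an administrative URL into an automation environment variable.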
Finding: A scheduled pipeline could keep transferring or transforming data until it is disabled.
Evidence: The skill explicitly supports scheduled recurring pipelines. This persistence is purpose-aligned, but scheduled jobs can continue acting long after initial setup.
Snippet: Scheduled execution: cron jobs or event triggers ... ./pipeline.sh schedule daily-sales "0 6 * * *"
Recommendation: Create schedules only after confirming the source, destination, frequency, owner, monitoring, and how to disable or roll back the job.
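That pre-creation checklist can be backed by a small wrapper. In this sketch, schedule_with_owner, the audit-log filename, and the dry-run echo are all illustrative assumptions; the field count is a sanity check, not a full cron parser.

```shell
# Validate a cron expression and record an owner before scheduling a
# recurring pipeline, so the job can later be traced and disabled.
schedule_with_owner() {
    name="$1"; cron="$2"; owner="$3"
    fields=$(echo "$cron" | wc -w)
    if [ "$fields" -ne 5 ]; then
        echo "invalid cron expression '$cron': $fields fields, expected 5" >&2
        return 1
    fi
    # audit trail: who owns this job and when it was created
    echo "$(date -u +%Y-%m-%dT%H:%M:%SZ) name=$name owner=$owner cron='$cron'" \
        >> pipeline-schedules.log
    echo "would run: ./pipeline.sh schedule $name \"$cron\""
}

# Example: schedule_with_owner daily-sales "0 6 * * *" data-team@example.com
```

Recording the owner at creation time is what makes the "how to disable or roll back" step answerable months later, when the person who set up the job may be gone.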
Finding: Pipeline names, error details, or operational metadata could be sent to third-party notification channels.
Evidence: The monitoring examples send alerts to email or a webhook. This is expected for failure notification, but the artifact does not specify what data is included in alerts.
Snippet: ./pipeline.sh alert my-pipeline email --to admin@example.com ... ./pipeline.sh alert my-pipeline webhook --url "https://open.feishu.cn/..."
Recommendation: Review alert contents, avoid sending secrets or raw records in notifications, and use trusted webhook destinations.
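To reduce what leaves the host, alert text can be passed through a redaction filter before it is emailed or posted. This sketch assumes GNU sed; redact_alert and its two patterns (key=value secrets and URL passwords) are illustrative starting points, not a complete secret scanner.

```shell
# Mask obvious secrets in an alert message before it is emailed or posted
# to a webhook. Extend the patterns for your own credential formats.
redact_alert() {
    sed -E \
        -e 's/(password|passwd|token|secret|api_key)=[^[:space:]]+/\1=***/g' \
        -e 's#(://[^:/@]+):[^@]+@#\1:***@#g'
}

# Example:
#   echo "load failed: postgres://etl:hunter2@db/sales password=abc" \
#       | redact_alert
```

Connection URLs are the most common leak in pipeline failure messages, since drivers often echo the full URL, password included, in their error text.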
