Skill v1.0.0

ClawScan security

Data Transfer Optimizer · ClawHub's context-aware review of the artifact, metadata, and declared behavior.

Scanner verdict

Suspicious · Mar 1, 2026, 5:09 AM
Verdict
suspicious
Confidence
medium
Model
gpt-5-mini
Summary
The skill's stated goal (analyzing AWS data-transfer costs and producing Terraform) requires access to AWS billing and traffic data, but neither the instructions nor the metadata request credentials or explain how that data will be provided. This mismatch is concerning.
Guidance
This skill wants to analyze your AWS data-transfer costs and produce Terraform, but it gives no instructions for how the agent will obtain the needed billing/traffic data. Before installing or using it:

1. Don't provide long-lived AWS root credentials. Prefer one of two approaches: (a) provide a limited, read-only IAM role or temporary STS credentials scoped to Cost Explorer, CloudWatch Logs/VPC Flow Logs, S3 access logs, and any S3 buckets holding logs; or (b) export and sanitize the cost and traffic reports yourself (CSV/Parquet) and paste only the minimal datasets the skill needs.
2. Ask the skill author to explicitly list the AWS APIs, CLI commands, or exact data files the skill needs, along with the minimal IAM policy required.
3. If you let the skill produce 'ready-to-apply' Terraform, review all generated code before applying it; verify it does not make broad network or IAM changes.
4. If the author cannot clarify how data is accessed, treat the skill as untrustworthy and do not give it credentials or raw logs.

Providing those clarifications (which APIs/data sources, plus an explicit least-privilege IAM policy) would make this assessment more confident.
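As an illustration of the least-privilege policy the guidance asks for, here is a minimal read-only sketch. The action list and the bucket name are assumptions about what a cost-analysis skill like this would plausibly need, not something the skill itself declares:

```python
import json

# Hypothetical least-privilege, read-only policy for a data-transfer
# cost-analysis skill. Actions cover only the data sources named in the
# guidance: Cost Explorer, CloudWatch Logs / VPC Flow Logs, and S3 log
# buckets. The bucket name "example-log-bucket" is a placeholder.
READ_ONLY_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CostExplorerRead",
            "Effect": "Allow",
            "Action": ["ce:GetCostAndUsage", "ce:GetCostAndUsageWithResources"],
            "Resource": "*",
        },
        {
            "Sid": "FlowLogRead",
            "Effect": "Allow",
            "Action": [
                "logs:DescribeLogGroups",
                "logs:GetLogEvents",
                "logs:FilterLogEvents",
            ],
            "Resource": "*",
        },
        {
            "Sid": "LogBucketRead",
            "Effect": "Allow",
            # Scope to the specific bucket(s) holding your access/flow logs.
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-log-bucket",
                "arn:aws:s3:::example-log-bucket/*",
            ],
        },
    ],
}

print(json.dumps(READ_ONLY_POLICY, indent=2))
```

Attaching a policy like this to a short-lived STS role, rather than handing over static keys, limits the blast radius if the skill misbehaves.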

Review Dimensions

Purpose & Capability
concern · The skill claims to break down AWS transfer costs, identify traffic patterns, and generate Terraform VPC endpoint configs. Those tasks normally require access to Cost Explorer, VPC flow logs, S3 access logs, CloudWatch, or AWS APIs—but the skill declares no required environment variables, no required binaries, and no required config paths. It is unclear how the skill expects to obtain the necessary AWS data.
Instruction Scope
concern · SKILL.md tells the agent to 'Break down data transfer costs' and 'Identify top traffic patterns' and to 'Always check for S3 and DynamoDB traffic going via NAT Gateway,' but it does not specify concrete data sources (Cost Explorer exports, flow logs, Athena queries, or CLI/API commands), nor does it instruct the user to supply logs or credentials. The instructions are high-level and grant broad discretion without clear boundaries for what data the agent may access or require from the user.
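One way the author could make those data sources concrete is to declare the exact API calls. A hedged sketch of the kind of Cost Explorer query such a skill might name: the filter dimension and value below are assumptions about how to scope the query to data-transfer usage, not something the skill specifies:

```python
from datetime import date, timedelta

def transfer_cost_request(days: int = 30) -> dict:
    """Build parameters for a Cost Explorer GetCostAndUsage call limited
    to data-transfer usage. The USAGE_TYPE_GROUP filter value is an
    assumed example; a real skill manifest should list the exact
    dimensions and values it queries.

    Usage (requires AWS credentials, so not executed here):
        boto3.client("ce").get_cost_and_usage(**transfer_cost_request())
    """
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": end.isoformat()},
        "Granularity": "DAILY",
        "Metrics": ["UnblendedCost"],
        # Group by usage type to surface NAT Gateway vs. endpoint traffic.
        "GroupBy": [{"Type": "DIMENSION", "Key": "USAGE_TYPE"}],
        "Filter": {
            "Dimensions": {
                "Key": "USAGE_TYPE_GROUP",
                "Values": ["EC2: Data Transfer - Internet (Out)"],
            }
        },
    }
```

A declaration at this level of detail would let a reviewer derive the minimal IAM policy directly from the calls the skill makes.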
Install Mechanism
ok · This is an instruction-only skill with no install spec and no code files, so it does not write code to disk or fetch remote binaries. That lowers install risk.
Credentials
concern · No environment variables, credentials, or IAM scope are declared, despite the need to read AWS billing and network telemetry. That is either an omission (the skill should list the required read-only AWS credentials) or implies the skill expects users to paste sensitive exports into prompts. Both possibilities deserve scrutiny.
Persistence & Privilege
ok · The skill is not configured as always:true and does not request persistent installation-level privileges. Autonomous invocation is allowed by default, but that is not by itself an additional red flag here.