Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Aliyun Dlf Manage

v1.0.0

Use when managing Alibaba Cloud Data Lake Formation (DataLake) via OpenAPI/SDK, including the user asks for DataLake catalog resource operations, configurati...

Downloads: 0 current · 0 all-time
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.

Security Scan

VirusTotal: Benign
OpenClaw: Suspicious (medium confidence)
Purpose & Capability
The skill manages Alibaba Cloud Data Lake Formation, and the included script and SKILL.md use Alibaba OpenAPI endpoints and SDK patterns, which aligns with the stated purpose. However, the registry metadata claims no credentials are required, while the SKILL.md explicitly requires ALICLOUD_ACCESS_KEY_ID / ALICLOUD_ACCESS_KEY_SECRET plus an optional ALICLOUD_REGION_ID or ~/.alibabacloud/credentials file. That mismatch should be resolved before use.
Instruction Scope
Runtime instructions are focused on API discovery and calling DataLake APIs, plus writing artifacts under output/aliyun-dlf-manage/. The SKILL.md also directs the agent to prefer environment variables and read the shared credentials file (~/.alibabacloud/credentials) if present. Those file/env reads are reasonable for cloud management but should have been declared in the metadata.
Install Mechanism
There is no install spec (the skill is instruction-only), and the included script is a small, straightforward Python script that fetches JSON from api.aliyun.com. No downloads from untrusted hosts or archive extraction are present.
Credentials
The SKILL.md requires cloud credentials (AccessKey ID/Secret and optional region or shared credentials file). Those credentials are appropriate for the skill's purpose, but the registry metadata and primaryEnv do not declare them. Not declaring required secrets in metadata reduces visibility and is a security/integrity concern.
Persistence & Privilege
The skill is not marked always:true and does not request elevated persistence. It writes output artifacts to a local output/ directory as documented, which is reasonable for tooling and reproducibility.
What to consider before installing
This skill appears to do what it says (manage Alibaba Cloud DataLake), and the small Python helper only fetches official OpenAPI metadata, but there is a metadata vs. instruction mismatch you should address before use. Actions to consider:

  • Confirm with the publisher (or inspect SKILL.md) that the skill needs ALICLOUD_ACCESS_KEY_ID and ALICLOUD_ACCESS_KEY_SECRET and that these should be provided via environment variables or the shared credentials file. The registry should declare these as required/primaryEnv.
  • Provide least-privilege Alibaba credentials (a role or keypair limited to DataLake operations) rather than full-account keys.
  • Run the skill in an isolated/test environment first; review any artifacts under output/aliyun-dlf-manage/ for sensitive identifiers before storing them elsewhere.
  • Inspect ~/.alibabacloud/credentials (if present) to understand what keys the agent might pick up, and consider using temporary/ephemeral credentials.
  • If you need stronger assurance, ask the publisher to update the registry metadata to declare required env vars and primary credential, and to sign or host the script from a trusted location.


latest: vk9713qyk9tevp581nsanmkf0e1841wfk


SKILL.md

Category: service

Data Lake Formation

Use Alibaba Cloud OpenAPI (RPC) with official SDKs or OpenAPI Explorer to manage resources for Data Lake Formation.

Workflow

  1. Confirm region, resource identifiers, and desired action.
  2. Discover API list and required parameters (see references).
  3. Call API with SDK or OpenAPI Explorer.
  4. Verify results with describe/list APIs.

AccessKey priority (must follow)

  1. Environment variables: ALICLOUD_ACCESS_KEY_ID / ALICLOUD_ACCESS_KEY_SECRET / ALICLOUD_REGION_ID. Region policy: ALICLOUD_REGION_ID is an optional default; if unset, decide the most reasonable region for the task, and if unclear, ask the user.
  2. Shared config file: ~/.alibabacloud/credentials
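The priority above can be sketched as a small resolver. The `[default]` section and the `access_key_id` / `access_key_secret` key names follow the conventional Alibaba Cloud shared-credentials layout, but verify them against your own file:

```python
import configparser
import os
from pathlib import Path

def resolve_credentials(env=None, cred_path="~/.alibabacloud/credentials"):
    """Resolve an AccessKey pair: environment variables first,
    then the shared credentials file."""
    env = os.environ if env is None else env
    ak = env.get("ALICLOUD_ACCESS_KEY_ID")
    sk = env.get("ALICLOUD_ACCESS_KEY_SECRET")
    if ak and sk:
        # Region stays optional; None means "decide or ask later".
        return ak, sk, env.get("ALICLOUD_REGION_ID")

    path = Path(cred_path).expanduser()
    if path.exists():
        cfg = configparser.ConfigParser()
        cfg.read(path)
        if cfg.has_section("default"):
            section = cfg["default"]
            return (section.get("access_key_id"),
                    section.get("access_key_secret"),
                    env.get("ALICLOUD_REGION_ID"))
    raise RuntimeError("no Alibaba Cloud credentials found")
```

Passing `env` as a plain dict keeps the resolver testable without mutating the real process environment.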

API discovery

  • Product code: DataLake
  • Default API version: 2020-07-10
  • Use OpenAPI metadata endpoints to list APIs and get schemas (see references).
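As a rough sketch, discovery only needs the product code and version to form a metadata URL. The path shape below is an assumption for illustration; the authoritative endpoints are the ones used by scripts/list_openapi_meta_apis.py:

```python
def build_meta_url(product_code="DataLake", version="2020-07-10",
                   base="https://api.aliyun.com/meta/v1"):
    """Compose a metadata-listing URL for a product/version pair.
    The path layout here is illustrative, not an official contract."""
    return f"{base}/products/{product_code}/versions/{version}/apis"
```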

High-frequency operation patterns

  1. Inventory/list: prefer List* / Describe* APIs to get current resources.
  2. Change/configure: prefer Create* / Update* / Modify* / Set* APIs for mutations.
  3. Status/troubleshoot: prefer Get* / Query* / Describe*Status APIs for diagnosis.
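A simple name-based triage of the patterns above can help decide which APIs are safe to call during inventory versus which mutate state. This is a heuristic sketch on the prefixes listed here, not an exhaustive classifier:

```python
def classify_api(name: str) -> str:
    """Bucket an API name by the naming patterns above."""
    if name.endswith("Status") or name.startswith(("Get", "Query")):
        return "status"        # diagnosis / troubleshooting
    if name.startswith(("List", "Describe")):
        return "inventory"     # read-only listing
    if name.startswith(("Create", "Update", "Modify", "Set")):
        return "mutate"        # changes cloud state
    return "unknown"
```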

Minimal executable quickstart

Use metadata-first discovery before calling business APIs:

python scripts/list_openapi_meta_apis.py

Optional overrides:

python scripts/list_openapi_meta_apis.py --product-code <ProductCode> --version <Version>

The script writes API inventory artifacts under the skill output directory.

Output policy

If you need to save responses or generated artifacts, write them under: output/aliyun-dlf-manage/
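A minimal helper for this policy might look like the following; the timestamped-filename scheme is an illustration, not something the skill prescribes:

```python
import json
import time
from pathlib import Path

OUTPUT_DIR = Path("output/aliyun-dlf-manage")

def save_artifact(name, payload, out_dir=OUTPUT_DIR):
    """Write a JSON artifact under the skill output directory."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"{int(time.time())}-{name}.json"
    path.write_text(json.dumps(payload, indent=2, ensure_ascii=False))
    return path
```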

Validation

mkdir -p output/aliyun-dlf-manage
for f in skills/data-lake/aliyun-dlf-manage/scripts/*.py; do
  python3 -m py_compile "$f"
done
echo "py_compile_ok" > output/aliyun-dlf-manage/validate.txt

Pass criteria: command exits 0 and output/aliyun-dlf-manage/validate.txt is generated.

Output And Evidence

  • Save artifacts, command outputs, and API response summaries under output/aliyun-dlf-manage/.
  • Include key parameters (region/resource id/time range) in evidence files for reproducibility.

Prerequisites

  • Configure least-privilege Alibaba Cloud credentials before execution.
  • Prefer environment variables: ALICLOUD_ACCESS_KEY_ID, ALICLOUD_ACCESS_KEY_SECRET, optional ALICLOUD_REGION_ID.
  • If region is unclear, ask the user before running mutating operations.
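The ask-before-mutating rule can be enforced with a small guard; the mutating prefixes are taken from the operation patterns above, so extend the tuple if your workflow uses others (e.g. Delete*):

```python
MUTATING_PREFIXES = ("Create", "Update", "Modify", "Set")

def check_region_before_mutation(action, region):
    """Refuse to run a mutating API call when no region is confirmed."""
    if action.startswith(MUTATING_PREFIXES) and not region:
        raise ValueError(
            f"confirm a region with the user before calling {action}")
    return True
```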

References

  • Sources: references/sources.md

Files

4 total