Alicloud Data Lake Dlf

Manage Alibaba Cloud Data Lake Formation (DataLake) via OpenAPI/SDK. Use whenever the user asks for DataLake catalog resource operations, configuration updat...

MIT-0 · Free to use, modify, and redistribute. No attribution required.
0 · 874 · 2 current installs · 2 all-time installs
Security Scan
VirusTotal
Benign
OpenClaw
Benign
high confidence
Purpose & Capability
Name and description match the included assets: SKILL.md documents using Alibaba Cloud OpenAPI/SDK for DataLake and the repo includes a script that fetches DataLake OpenAPI metadata from api.aliyun.com. Nothing in the files requires unrelated services or capabilities.
Instruction Scope
SKILL.md instructs the agent to use environment credentials (ALICLOUD_ACCESS_KEY_ID / SECRET / REGION) or the shared config file (~/.alibabacloud/credentials), and to run the provided script, which fetches metadata from api.aliyun.com and writes outputs to output/alicloud-data-lake-dlf/. The instructions do not request reading unrelated system paths or exfiltrating data, though they do expect access to cloud credentials, which is reasonable for the purpose.
Install Mechanism
No install spec is present (instruction-only plus a small Python script). No network install or archive downloads are performed by the skill itself; the included script makes an outbound HTTPS request to a documented Alibaba API endpoint (api.aliyun.com).
Credentials
The skill's instructions legitimately require Alibaba Cloud credentials to operate. The registry metadata, however, lists no required env vars or primary credential, a minor inconsistency: the skill expects ALICLOUD_ACCESS_KEY_ID and ALICLOUD_ACCESS_KEY_SECRET (and optionally ALICLOUD_REGION_ID) at runtime but does not declare them as required in the registry metadata. This is likely an authoring omission rather than anything malicious.
Persistence & Privilege
The always flag is false and the skill is user-invocable; it does not request persistent installation or elevated platform privileges. Per SKILL.md, it writes artifacts only under the skill output directory.
Assessment
This skill appears to do what it says: it fetches DataLake OpenAPI metadata from api.aliyun.com and guides the agent to call Alibaba Cloud APIs. Before installing or running it:

  1. Supply least-privilege Alibaba Cloud credentials (ALICLOUD_ACCESS_KEY_ID / ALICLOUD_ACCESS_KEY_SECRET and optionally ALICLOUD_REGION_ID), or keep a properly permissioned ~/.alibabacloud/credentials file.
  2. Run the included script in a safe environment, since it makes HTTPS requests to api.aliyun.com and writes files under output/alicloud-data-lake-dlf/.
  3. Confirm the agent asks for the region and for user confirmation before performing any mutating operations.

Note that the registry metadata does not declare the env vars SKILL.md expects, so the platform may not automatically surface credential prompts; provide credentials manually and review them for least privilege. For extra assurance, review the small Python script (it only calls api.aliyun.com) and run it in a non-production environment first.

Like a lobster shell, security has layers — review code before you run it.

Current version: v1.0.3
latest: vk976wyr0v86yjm7e5ckj3ezrt182pm1q


SKILL.md

Category: service

Data Lake Formation

Use Alibaba Cloud OpenAPI (RPC) with official SDKs or OpenAPI Explorer to manage resources for Data Lake Formation.

Workflow

  1. Confirm region, resource identifiers, and desired action.
  2. Discover API list and required parameters (see references).
  3. Call API with SDK or OpenAPI Explorer.
  4. Verify results with describe/list APIs.

AccessKey priority (must follow)

  1. Environment variables: ALICLOUD_ACCESS_KEY_ID / ALICLOUD_ACCESS_KEY_SECRET / ALICLOUD_REGION_ID. Region policy: ALICLOUD_REGION_ID is an optional default; if unset, decide the most reasonable region for the task, and if unclear, ask the user.
  2. Shared config file: ~/.alibabacloud/credentials
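The resolution order above can be sketched as a small helper. This is illustrative only: the section and key names assumed for the shared credentials file (default, access_key_id, access_key_secret, region_id) should be checked against your actual file.

```python
import configparser
import os

def resolve_credentials(env=None, shared_path="~/.alibabacloud/credentials"):
    """Resolve an AccessKey pair: environment variables first, shared config second."""
    env = os.environ if env is None else env
    key = env.get("ALICLOUD_ACCESS_KEY_ID")
    secret = env.get("ALICLOUD_ACCESS_KEY_SECRET")
    if key and secret:
        # ALICLOUD_REGION_ID is an optional default and may be absent.
        return {"key": key, "secret": secret, "region": env.get("ALICLOUD_REGION_ID")}
    path = os.path.expanduser(shared_path)
    if os.path.exists(path):
        cfg = configparser.ConfigParser()
        cfg.read(path)
        sections = cfg.sections()
        name = "default" if "default" in cfg else (sections[0] if sections else None)
        if name:
            section = cfg[name]
            return {"key": section.get("access_key_id"),
                    "secret": section.get("access_key_secret"),
                    "region": section.get("region_id")}
    return None  # nothing found; the agent should ask the user
```

If this returns None, stop and ask the user for credentials rather than guessing.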

API discovery

  • Product code: DataLake
  • Default API version: 2020-07-10
  • Use OpenAPI metadata endpoints to list APIs and get schemas (see references).
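A metadata-first lookup might look like the sketch below. The URL layout here is a hypothetical example, not a documented contract; the bundled scripts/list_openapi_meta_apis.py is the source of truth for the real endpoint.

```python
import json
import urllib.request

# Hypothetical shape of an api.aliyun.com metadata endpoint; verify against
# the bundled script before relying on it.
META_URL = "https://api.aliyun.com/meta/v1/products/{product}/versions/{version}/apis"

def meta_url(product_code="DataLake", version="2020-07-10"):
    """Build the metadata URL for a product/version pair."""
    return META_URL.format(product=product_code, version=version)

def fetch_api_list(product_code="DataLake", version="2020-07-10"):
    """Fetch and decode the API inventory (makes a network call)."""
    with urllib.request.urlopen(meta_url(product_code, version)) as resp:
        return json.load(resp)
```

Discovering the API list first keeps the agent from guessing operation names or parameters before calling business APIs.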

High-frequency operation patterns

  1. Inventory/list: prefer List* / Describe* APIs to get current resources.
  2. Change/configure: prefer Create* / Update* / Modify* / Set* APIs for mutations.
  3. Status/troubleshoot: prefer Get* / Query* / Describe*Status APIs for diagnosis.
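The three prefix conventions above can be captured in a small classifier. This helper is hypothetical (not part of the skill) and matches on operation-name prefixes only.

```python
def classify_operation(api_name: str) -> str:
    """Map an API name to read / mutate / diagnose by its verb prefix."""
    # Describe*Status is a diagnostic pattern, so check it before plain Describe*.
    if api_name.startswith(("Get", "Query")) or (
        api_name.startswith("Describe") and api_name.endswith("Status")
    ):
        return "diagnose"
    if api_name.startswith(("Create", "Update", "Modify", "Set")):
        return "mutate"
    if api_name.startswith(("List", "Describe")):
        return "read"
    return "other"
```

Treating anything classified as "mutate" as requiring user confirmation lines up with the region/confirmation policy elsewhere in this SKILL.md.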

Minimal executable quickstart

Use metadata-first discovery before calling business APIs:

python scripts/list_openapi_meta_apis.py

Optional overrides:

python scripts/list_openapi_meta_apis.py --product-code <ProductCode> --version <Version>

The script writes API inventory artifacts under the skill output directory.

Output policy

If you need to save responses or generated artifacts, write them under: output/alicloud-data-lake-dlf/
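A minimal helper for following this policy (hypothetical, not part of the skill) could look like:

```python
import json
import pathlib

def save_artifact(name, payload, base="output/alicloud-data-lake-dlf"):
    """Write a JSON artifact under the skill output directory."""
    out = pathlib.Path(base)
    out.mkdir(parents=True, exist_ok=True)  # create the directory on first use
    path = out / name
    path.write_text(json.dumps(payload, indent=2, ensure_ascii=False))
    return path
```

Keeping every artifact under one directory makes runs easy to audit and clean up.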

Validation

mkdir -p output/alicloud-data-lake-dlf
for f in skills/data-lake/alicloud-data-lake-dlf/scripts/*.py; do
  python3 -m py_compile "$f"
done
echo "py_compile_ok" > output/alicloud-data-lake-dlf/validate.txt

Pass criteria: command exits 0 and output/alicloud-data-lake-dlf/validate.txt is generated.

Output And Evidence

  • Save artifacts, command outputs, and API response summaries under output/alicloud-data-lake-dlf/.
  • Include key parameters (region/resource id/time range) in evidence files for reproducibility.

Prerequisites

  • Configure least-privilege Alibaba Cloud credentials before execution.
  • Prefer environment variables: ALICLOUD_ACCESS_KEY_ID, ALICLOUD_ACCESS_KEY_SECRET, optional ALICLOUD_REGION_ID.
  • If region is unclear, ask the user before running mutating operations.

References

  • Sources: references/sources.md

Files

4 total
