Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Salesforce AI Agentforce Observability

v1.0.0

Agentforce session tracing extraction and analysis. TRIGGER when: user extracts STDM data from Data Cloud, analyzes agent session traces, debugs agent conver...

by Anush DSouza (@dsouza-anush)
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan

Capability signals: Crypto · Requires wallet · Can make purchases · Requires OAuth token. These labels describe what authority the skill may exercise; they are separate from suspicious or malicious moderation verdicts.

VirusTotal verdict: Suspicious
OpenClaw verdict: Suspicious (medium confidence)
Purpose & Capability
Name/description match the included artifacts: SQL query templates, Parquet/Polars analysis scripts, Data Cloud client and JWT-based auth references. The code and docs all target Data Cloud STDM extraction and Polars-based analysis, so capabilities align with the stated purpose. However, the registry metadata claims no required credentials or env vars while the README and SKILL.md explicitly require JWT/ECA auth, certificate files, and Data Cloud scopes—this missing declaration is inconsistent.
Instruction Scope
SKILL.md and README instruct the agent to run the bundled Python scripts to extract and analyze Parquet files and to use JWT Bearer auth for Data Cloud. The runtime instructions and hooks reference local paths, metadata files, and Parquet contents (e.g., ./stdm_data, ~/.sf/jwt). I see no instructions that exfiltrate data to unknown external endpoints in the provided content, but the code will call Salesforce Data Cloud APIs (expected). Hooks read local Parquet/metadata and may suggest further actions or cross-skill handoffs. The instructions are not overly broad, but they assume access to org credentials and local files that are not declared in the registry.
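Because the registry declares none of the local paths or credentials the scripts rely on, a reviewer has to find them by hand. A minimal stdlib-only sketch that flags bundled scripts touching environment variables, home-directory paths, or the network (the pattern names are illustrative, not taken from the skill itself):

```python
import re
from pathlib import Path

# Heuristic patterns for access worth reviewing; names are illustrative.
PATTERNS = {
    "env-var read": re.compile(r"os\.environ|getenv"),
    "home-dir path": re.compile(r"~/|expanduser"),
    "network call": re.compile(r"httpx\.|requests\."),
}

def audit_scripts(root):
    """Map each pattern name to the bundled scripts that match it."""
    hits = {name: [] for name in PATTERNS}
    for script in Path(root).rglob("*.py"):
        text = script.read_text(errors="replace")
        for name, pat in PATTERNS.items():
            if pat.search(text):
                hits[name].append(str(script))
    return hits
```

A heuristic like this only narrows the search; it does not replace reading scripts/auth.py and scripts/datacloud_client.py directly.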
Install Mechanism
There is no install spec despite a sizeable Python codebase and explicit dependency list (polars, pyarrow, pyjwt, cryptography, httpx, click, rich, pydantic). This means the skill would require manual dependency installation. No downloads from arbitrary URLs were found, but absence of an install mechanism and packaging information increases friction and risk (users might run the scripts without a controlled environment).
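Given the missing install spec, a reviewer might first verify which of the declared dependencies are importable in the current environment. A stdlib sketch (the pip-name-to-import-name mapping is my assumption; pyjwt importing as jwt is the one standard quirk):

```python
import importlib.util

# Declared dependencies per CREDITS.md; import names assumed from pip names.
DECLARED = {
    "polars": "polars", "pyarrow": "pyarrow", "pyjwt": "jwt",
    "cryptography": "cryptography", "httpx": "httpx",
    "click": "click", "rich": "rich", "pydantic": "pydantic",
}

def missing_dependencies():
    """Return the declared packages whose modules cannot be imported."""
    return [pkg for pkg, mod in DECLARED.items()
            if importlib.util.find_spec(mod) is None]
```

Running this inside a fresh virtualenv before and after installation confirms that only the declared packages are being pulled in.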
Credentials
The skill requires JWT Bearer auth (private key / cert) and Data Cloud scopes (cdp_query_api, cdp_profile_api) per README/SKILL.md and uses local cert paths (examples under ~/.sf/jwt). Yet the registry lists no required env vars, no primary credential, and no required config paths. That mismatch is significant: the code clearly needs sensitive org credentials / keys but the metadata omits them. The hooks/scripts also read parquet files and metadata from disk. Requesting org-level Data Cloud access is proportionate to the stated purpose, but the omission from declared requirements is a red flag.
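The JWT Bearer requirement in the README corresponds to Salesforce's standard OAuth 2.0 JWT bearer flow. A minimal sketch, assuming the skill follows that flow; the consumer key, username, and key path below are placeholders, not values from the skill:

```python
import os
import time

def build_claims(client_id, username,
                 audience="https://login.salesforce.com"):
    """Claims for a Salesforce OAuth 2.0 JWT bearer assertion."""
    return {
        "iss": client_id,               # connected-app consumer key
        "sub": username,                # org user to act as
        "aud": audience,
        "exp": int(time.time()) + 300,  # short-lived assertion
    }

def get_access_token(client_id, username, key_path,
                     token_url="https://login.salesforce.com/services/oauth2/token"):
    import jwt    # PyJWT, from the skill's declared dependency list
    import httpx
    with open(os.path.expanduser(key_path)) as f:
        private_key = f.read()
    assertion = jwt.encode(build_claims(client_id, username),
                           private_key, algorithm="RS256")
    resp = httpx.post(token_url, data={
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": assertion,
    })
    resp.raise_for_status()
    return resp.json()["access_token"]
```

The private key referenced here is exactly the sensitive material the registry fails to declare, which is why the mismatch matters.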
Persistence & Privilege
always:false (good). The agent interface allows implicit invocation (allow_implicit_invocation: true), which is normal for skills. Hooks (post-tool-use scripts) will run based on tool results and can read local extraction artifacts; they do not appear to modify other skills or global agent settings. Because the skill requires org credentials, autonomous invocation combined with those credentials could increase blast radius — a consideration for operators.
What to consider before installing

This skill appears to implement the observability functionality it promises, but it is poorly packaged: it includes many Python scripts and explicit auth requirements (JWT/cert and Data Cloud scopes), yet the registry metadata declares no credentials or install steps. Before installing or running:

- Review scripts/auth.py and scripts/datacloud_client.py to confirm exactly which credentials, files, and endpoints are used, and that all network calls target Salesforce Data Cloud endpoints.
- Prepare a dedicated, least-privilege connected app / JWT keypair for this purpose (scopes: cdp_query_api, cdp_profile_api). Do NOT use high-privilege or unrelated org credentials.
- Run the code in an isolated environment (virtualenv/container) and install only the declared dependencies from CREDITS.md via pip to avoid supply-chain surprises.
- Because the package omits declared env vars, check whether the code reads credentials from unexpected locations (env vars, home-directory paths). Ensure private keys/certs are stored with appropriate permissions and are not world-readable.
- If you allow the agent to invoke the skill autonomously, restrict which principals can trigger it (or require user confirmation), because autonomous execution combined with org credentials increases risk.

If you want me to, I can inspect the full contents of scripts/auth.py and scripts/datacloud_client.py to identify exactly where credentials are loaded and which remote endpoints are called; that would materially increase confidence in this assessment.
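The "not world-readable" check on private keys can be sketched with the stdlib alone (POSIX permission bits; the key path is a placeholder):

```python
import os
import stat

def key_is_private(path):
    """True if the file is accessible only by its owner (e.g. mode 0600)."""
    mode = os.stat(os.path.expanduser(path)).st_mode
    return (mode & (stat.S_IRWXG | stat.S_IRWXO)) == 0
```

Run it against the cert paths the README mentions (e.g. under ~/.sf/jwt) before handing the agent access to them.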


Versions: latest (vk97csex9tg8tm4rrpz9smw21td84g4gq) · salesforce (vk97csex9tg8tm4rrpz9smw21td84g4gq)

