Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

SCF Deep Analysis

v1.0.2

Controller-level Statement of Cash Flows deep analysis for QBO-connected clients. Computes CF Quality Ratio, Free Cash Flow, working capital movement drivers...
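The headline metrics named in the description have standard textbook definitions; a minimal sketch with hypothetical figures (not drawn from any QBO data):

```python
def cf_quality_ratio(operating_cash_flow: float, net_income: float) -> float:
    """CFO divided by net income; values near or above 1.0 suggest earnings are cash-backed."""
    return operating_cash_flow / net_income

def free_cash_flow(operating_cash_flow: float, capital_expenditures: float) -> float:
    """Standard definition: cash flow from operations minus capital expenditures."""
    return operating_cash_flow - capital_expenditures

# Hypothetical figures for illustration only.
ratio = cf_quality_ratio(120_000.0, 100_000.0)   # 1.2
fcf = free_cash_flow(120_000.0, 45_000.0)        # 75000.0
```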

License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal
Benign
OpenClaw
Suspicious (medium confidence)
⚠ Purpose & Capability
The SKILL.md clearly requires a QBO connection, a QBO auth token, and a local Python script (scripts/pipelines/scf-deep-analysis.py) to run, yet the registry metadata declares no credentials, no code files, and no required environment variables. For a data-sensitive integration, documentation that asks for tokens while the metadata requests none is an inconsistency the publisher has not explained.
⚠ Instruction Scope
The runtime instructions pull reports from QBO (including P&L and GL vendor-level transactions), read and write a cache path (.cache/scf-deep-analysis/{slug}.json), and write an Excel workbook to disk (default ~/Desktop). The doc references a 'report {slug} cf' command and a local Python script, but defines neither which CLI provides 'report' nor where the script comes from. Access to GL vendor transactions (potentially sensitive PII) and to arbitrary cache files increases the skill's scope and sensitivity.
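Since the skill ships no code, the cache behavior described above can only be approximated; the helper names and JSON schema below are assumptions, not the publisher's implementation:

```python
import json
from pathlib import Path

# Cache location taken from the skill's documentation.
CACHE_DIR = Path(".cache/scf-deep-analysis")

def load_cached_report(slug: str) -> dict:
    """Return the cached report for a client slug, or an empty dict if none exists."""
    path = CACHE_DIR / f"{slug}.json"
    if path.exists():
        return json.loads(path.read_text())
    return {}

def save_cached_report(slug: str, report: dict) -> None:
    """Persist a report to the per-slug cache file. Note: data survives between runs."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    (CACHE_DIR / f"{slug}.json").write_text(json.dumps(report, indent=2))
```

Anything written through a helper like `save_cached_report` persists on disk until manually deleted, which is why the cache path matters for PII review.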
Install Mechanism
There is no install spec and no downloaded code included with the skill (it is instruction-only), which lowers installer risk. The SKILL.md does require that openpyxl and Node.js be present, but the registry package itself provides no install steps.
⚠ Credentials
The documentation requires a 'QBO auth token configured' and a connected QBO client slug, but the skill metadata lists no required environment variables and no primary credential. That omission is disproportionate and confusing for a skill that needs access to accounting data. The skill also reads a local cache path and writes output files, yet neither path is declared as required config access.
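A registry that declared its credential would let users supply the token explicitly and fail fast when it is missing. The variable name `QBO_AUTH_TOKEN` below is an assumption for illustration; the skill documents no variable name at all:

```python
import os

def get_qbo_token() -> str:
    """Read the QBO credential from the environment, failing fast if it is absent."""
    token = os.environ.get("QBO_AUTH_TOKEN")  # assumed name; not documented by the skill
    if not token:
        raise RuntimeError(
            "QBO_AUTH_TOKEN is not set; declare and provide it before running the pipeline."
        )
    return token
```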
Persistence & Privilege
The `always` flag is false and the skill does not request permanent platform presence. It does read and update a local cache (.cache/scf-deep-analysis) and writes reports to disk, which is normal for a pipeline, but cached files can persist sensitive data between runs, so users should review the cache handling.
What to consider before installing
This skill's description and runtime instructions expect a local Python script, a QBO connection, and a QBO auth token, but the package reviewed here contains no code and declares no credentials. Before installing or running:

1. Confirm you actually have the referenced script (scripts/pipelines/scf-deep-analysis.py) from a trusted source.
2. Ask the publisher which CLI provides the undocumented 'report {slug} cf' command and why Node.js is required.
3. Verify how QBO credentials should be provided and stored; the registry should declare the required environment variables.
4. Test in a QBO sandbox and on non-production data first.
5. Inspect the local cache (.cache/scf-deep-analysis) and output paths to ensure they don't leak client PII.
6. Request source code or a homepage from the publisher.

If the publisher cannot supply these or the discrepancies remain, treat the skill as untrusted.
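The cache inspection in step (5) can be done with a short script. The PII patterns below are illustrative assumptions (SSN-like and email-like strings), not an exhaustive check:

```python
import re
from pathlib import Path

# Illustrative patterns only: SSN-like and email-like strings.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),       # SSN-shaped
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),     # email-shaped
]

def scan_cache_for_pii(cache_dir: str = ".cache/scf-deep-analysis") -> list[str]:
    """Return cache files whose contents match any illustrative PII pattern."""
    flagged = []
    for path in Path(cache_dir).glob("*.json"):
        text = path.read_text(errors="ignore")
        if any(p.search(text) for p in PII_PATTERNS):
            flagged.append(str(path))
    return flagged
```

Run it against both the cache directory and the Excel output directory (after converting workbooks to text) before sharing any results outside your environment.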

Like a lobster shell, security has layers — review code before you run it.

latest: vk9799b1703nhcyr0hrc23fzsan83c9ce

