suspicious.dangerous_exec
- Location: scripts/export-to-notes.js:160
- Finding: Shell command execution detected (child_process).
Advisory: Audited by static analysis on May 10, 2026.
Detected: suspicious.dangerous_exec, suspicious.dynamic_code_execution, suspicious.env_credential_access (+1 more)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
A malicious social-media post included in an export could run commands on the user's computer.
Scraped post text is placed into markdown, then interpolated into a shell command with incomplete escaping; title and tags are also interpolated. A crafted post or argument could execute local shell commands when exporting to Bear.
```js
const tagArgs = tagList.map(t => `--tag "${t}"`).join(' ');
const escaped = markdown.replace(/"/g, '\\"').replace(/\$/g, '\\$');
const command = `echo "${escaped}" | grizzly create --title "${title}" ${tagArgs}`;
execSync(command, { stdio: 'inherit' });
```

Do not use the Bear export until it is rewritten to avoid shell interpolation, such as using execFile/spawn with argument arrays and passing note content through stdin.
Exports could use unexpected account authority or send collected content to an account the user did not intend.
The exporter reads provider credentials and the static scan shows an apparent hardcoded Microsoft Graph token that would take precedence over the environment token.
```js
const notionKey = apiKey || process.env.NOTION_API_KEY;
...
accessToken: [REDACTED] || process.env.MS_GRAPH_TOKEN,
```
Remove any embedded tokens, require explicit user-supplied credentials, and declare Notion/Microsoft Graph credential requirements in metadata.
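One way to enforce the "explicit user-supplied credentials" recommendation is to fail fast when a token is absent rather than falling back to anything embedded. This is a hedged sketch; the `NOTION_API_KEY` and `MS_GRAPH_TOKEN` names come from the evidence snippet above.

```javascript
// Sketch: no hardcoded fallback. If the user has not supplied a token,
// refuse to run instead of silently using an embedded credential.
function requireCredential(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`${name} is not set; supply it explicitly before exporting`);
  }
  return value;
}
```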
Using the skill as documented could violate platform rules, trigger account bans, or collect data in ways users did not intend.
The documentation goes beyond normal rate-limit guidance into evasion-style scraping practices, including residential proxy rotation and simulated human behavior.
```md
## Anti-Bot Avoidance ... Mouse Movement Simulation ... Proxy Rotation
For high-volume scraping:
- Use residential proxies
- Rotate every 10-20 requests
```
Prefer official APIs, remove proxy/evasion guidance, and stop automation when platforms block or disallow access.
Dependency installation may pull a newer compatible package version than the author tested.
The skill depends on an external package with a semver range rather than a lockfile-pinned version; this is expected for Playwright scraping but should be installed from a trusted source.
```json
"dependencies": { "playwright": "^1.40.0" }
```

Use a trusted package registry, review the package-lock if one is added, and pin dependency versions for reproducible installs.
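A pinned manifest entry would drop the caret so installs resolve to one exact version (the version below is illustrative; pin whichever release the author actually tested):

```json
"dependencies": { "playwright": "1.40.0" }
```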
A local record of scraped post identifiers or URLs may remain after the task is done.
The deduplication feature persists seen scraped posts across runs in a local file.
```js
const seenPath = path.join(process.cwd(), '.skroller-seen.json');
...
fs.writeFileSync(seenPath, JSON.stringify(seen, null, 2));
```
Review or delete .skroller-seen.json when finished, and avoid storing scraped personal data longer than needed.