finviz-crawler
ReviewAudited by ClawScan on May 10, 2026.
Overview
The skill mostly matches its stated purpose, but its ticker-removal command can delete local archive files and directories using insufficiently validated ticker input.
Review this skill before installing. It is a coherent financial-news crawler, but the installer downloads dependencies and creates background service files, and the ticker removal command can delete saved article data. Avoid unusual ticker names, back up any archive you care about, and only use remove/purge behavior after confirming what files will be deleted.
Findings (4)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
A malformed or unexpected ticker value could cause deletion outside the intended ticker archive folder, and normal ticker removal can erase the local news history for that ticker without a clear confirmation step.
Ticker strings are only stripped and uppercased before being used to build a filesystem path that is recursively deleted. The remove command also deletes matching article rows and files, not just the tracked ticker entry.
sym = sym.strip().upper()
...
subfolder = os.path.join(articles_dir, sym.lower())
...
shutil.rmtree(subfolder)
...
cur = fconn.execute("DELETE FROM articles WHERE ticker = ?", (sym,))Validate ticker symbols with a strict allowlist such as A-Z, 0-9, dot, and dash; resolve and verify deletion paths stay inside the articles directory; and separate 'stop tracking' from 'purge saved articles' with an explicit confirmation or --purge flag.
Installing the skill may download and execute third-party package setup code and browser assets from external sources.
The installer pulls unpinned Python packages and browser components at install time. This is expected for a Playwright-based crawler, but it depends on external package provenance.
pip_install(["crawl4ai", "feedparser"]) ... run([sys.executable, "-m", "crawl4ai.install"], check=False) ... run([sys.executable, "-m", "playwright", "install", "chromium"], check=False)
Pin dependency versions, publish hashes or a lockfile, and review the packages before running the installer.
A malicious or compromised article page could include text that attempts to influence later summaries or agent behavior.
The skill stores and later returns untrusted web article text for LLM summarization workflows, so article content could contain instructions that should not be treated as agent commands.
Built for AI summarization — the query tool outputs clean text/JSON optimized for LLM digests. Pair with an OpenClaw cron job for automated morning briefings
Treat scraped article text as untrusted data, clearly delimit it in prompts, and instruct summarization agents not to follow instructions contained in article content.
The crawler may keep running in the background, using network, disk, and CPU resources after setup.
The installer creates user-level persistence so the crawler can run automatically on login or be kept alive by launchd. This matches the stated daemon purpose but is persistent background behavior.
run(["systemctl", "--user", "enable", "finviz-crawler.service"], check=False) ... <key>RunAtLoad</key><true/> <key>KeepAlive</key><true/>
Install only if you want a persistent crawler, and document clear stop, disable, and uninstall commands for both systemd and launchd.
