AI Research Scraper
Pass. Audited by VirusTotal on May 12, 2026.
Overview
Type: OpenClaw Skill
Name: ai-research-scraper
Version: 1.8.14

The skill is designed to scrape AI research information, primarily by invoking the local `tavily-search` OpenClaw skill via `subprocess.run` in `scripts/scraper.py`. There are several functional bugs and inconsistencies (e.g., `scraper.py` does not read `references/websites.txt` as advertised, and some test scripts attempt to import a non-existent `translate_text` function), but these do not indicate malicious intent or significant security vulnerabilities. All external network calls go to legitimate translation or search APIs, and API keys appear only as placeholders. There is no evidence of data exfiltration, malicious execution, persistence, or prompt injection against the agent.
Findings (0)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
The results and network behavior depend on the separate tavily-search skill; if that skill is missing, changed, or configured differently, this skill may behave differently than expected.
The main scraper depends on a hard-coded path to a script in another skill, and on Node.js. That dependency lies outside this package's shown source and install spec, although it is aligned with the stated search purpose.
```python
subprocess.run(
    [
        "node",
        "/root/.openclaw/workspace/skills/tavily-search/scripts/search.mjs",
        "AI product development",
        "-n", "10",
        "--topic", "news",
    ],
    ...
)
```
Review and install the tavily-search dependency separately, confirm any credentials it uses, and prefer clearly declared/pinned dependencies.
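To make that dependency explicit, a caller could fail fast when it is absent rather than surfacing an opaque subprocess error. A minimal sketch, assuming the hard-coded path from the finding above; the wrapper function and its name are illustrative, not part of the skill:

```python
import shutil
import subprocess
from pathlib import Path

# Hard-coded dependency path taken from the finding above.
SEARCH_SCRIPT = Path("/root/.openclaw/workspace/skills/tavily-search/scripts/search.mjs")

def run_search(query: str, num_results: int = 10, topic: str = "news") -> str:
    """Invoke the tavily-search skill via Node, failing fast if it is missing."""
    if shutil.which("node") is None:
        raise RuntimeError("Node.js is required but was not found on PATH")
    if not SEARCH_SCRIPT.exists():
        raise RuntimeError(f"tavily-search dependency not found at {SEARCH_SCRIPT}")
    result = subprocess.run(
        ["node", str(SEARCH_SCRIPT), query, "-n", str(num_results), "--topic", topic],
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

Checking for the script up front also documents the dependency in one place, which makes it easier to pin or relocate later.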
If you provide API keys, those keys may grant access to your provider account or quota for the related service.
The reference docs mention third-party API keys for optional provider integrations. This is expected for search/translation services, and the supplied code shows placeholders rather than hardcoded real secrets.
The Google Cloud Translation API requires the following configuration: - API key ... The Tavily Search API requires the following configuration: - API key
Use scoped, revocable keys; avoid pasting real secrets directly into scripts; and check provider permissions before enabling optional integrations.
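One way to follow that advice is to read keys from the environment and reject obvious placeholder values. A minimal sketch; the environment-variable names and the `YOUR_` placeholder convention are assumptions, not something the skill documents:

```python
import os

def get_api_key(env_var: str) -> str:
    """Fetch a provider key from the environment, rejecting placeholders."""
    key = os.environ.get(env_var, "")
    # Treat unset values and placeholder-style values as missing.
    if not key or key.startswith("YOUR_"):
        raise RuntimeError(f"Set {env_var} to a real, scoped, revocable API key")
    return key

# Hypothetical usage:
# tavily_key = get_api_key("TAVILY_API_KEY")
# translate_key = get_api_key("GOOGLE_TRANSLATE_API_KEY")
```

Keeping secrets out of the scripts also keeps them out of version control and out of any logs that echo the source.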
Editing `references/websites.txt` or passing the documented options may not actually limit sources, date range, topic, or output size in the shown script.
The documentation advertises configurable sites and command-line options, while the supplied main scraper uses a fixed Tavily news query and does not parse these options. This is a capability/scoping mismatch rather than harmful behavior.
Edit the `references/websites.txt` file to add or remove target websites ... scraper.py --max-tokens 500 ... --days 7 ... --topic product-development
Treat the documentation as incomplete; inspect or update the script if you need strict source scoping, token limits, date filters, or topic controls.
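If strict scoping matters, the documented flags could be wired into the script explicitly. A sketch under stated assumptions: the flag names come from the documentation quoted above, while the parsing logic and the `load_websites` helper are illustrative additions, not present in the shipped script:

```python
import argparse
from pathlib import Path

def parse_args(argv=None):
    """Parse the options the documentation advertises for scraper.py."""
    p = argparse.ArgumentParser(description="AI research scraper")
    p.add_argument("--max-tokens", type=int, default=500,
                   help="cap on output size")
    p.add_argument("--days", type=int, default=7,
                   help="restrict results to the last N days")
    p.add_argument("--topic", default="news",
                   help="search topic, e.g. product-development")
    p.add_argument("--websites", type=Path,
                   default=Path("references/websites.txt"),
                   help="file listing allowed source sites, one per line")
    return p.parse_args(argv)

def load_websites(path: Path) -> list:
    """Read the site allow-list, skipping blank lines and # comments."""
    if not path.exists():
        return []
    return [line.strip() for line in path.read_text().splitlines()
            if line.strip() and not line.startswith("#")]
```

Until the script does something like this, treat `websites.txt` and the command-line flags as inert, and verify scoping by reading the code rather than the docs.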
