Decodo Web Scraper

v1.1.0

Search Google, scrape web pages, Amazon product pages, YouTube subtitles, or Reddit (post/subreddit) using the Decodo Scraper OpenClaw Skill.

License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
Name/description, README, SKILL.md and the included tools/scrape.py are all aligned: the skill calls Decodo's scraping API for Google, universal URLs, Amazon, YouTube, and Reddit. No unrelated capabilities or credentials are requested by the code.
Instruction Scope
SKILL.md and the CLI script focus only on building requests to Decodo's scraper API. The runtime instructions ask the agent to set DECODO_AUTH_TOKEN or put it in a .env file; the script reads only that token and does not direct the agent to read other system files or exfiltrate additional environment variables.
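A minimal sketch of how such a token-reading helper might look. The helper name and error message are illustrative; only the DECODO_AUTH_TOKEN variable and the .env fallback come from the review:

```python
import os

# Optional .env support: the skill's requirements.txt lists python-dotenv,
# but this sketch degrades gracefully if it is not installed.
try:
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass

def get_auth_token() -> str:
    """Return DECODO_AUTH_TOKEN from the environment, failing loudly if unset."""
    token = os.environ.get("DECODO_AUTH_TOKEN")
    if not token:
        raise RuntimeError(
            "DECODO_AUTH_TOKEN is not set; obtain one from Decodo's dashboard"
        )
    return token
```

Reading only this one variable (and nothing else from the environment) is what keeps the instruction scope narrow.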
Install Mechanism
This is an instruction-only skill with a small Python helper and a requirements.txt (requests, python-dotenv). There is no download-from-arbitrary-URL or install script; risk from install mechanism is low.
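For reference, the dependency footprint described above amounts to a two-line requirements.txt of roughly this shape (the version pins are assumptions, not necessarily the skill author's actual pins):

```
requests>=2.31
python-dotenv>=1.0
```

Both are widely used, auditable packages; there is no post-install hook or fetched script.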
Credentials
The skill legitimately requires a single DECODO_AUTH_TOKEN (Basic auth) to call Decodo's API, which is proportionate. However, the registry metadata at the top claims 'Required env vars: none' while SKILL.md and the code require DECODO_AUTH_TOKEN; this metadata mismatch should be fixed. Treat the token as sensitive and only provide one obtained from Decodo's dashboard.
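To illustrate how narrowly a credential like this is typically used, here is a hypothetical request builder. The endpoint path and payload shape are illustrative assumptions, not Decodo's documented API; only the Basic-auth pattern and the scraper-api.decodo.com host come from the review:

```python
import json
import urllib.request

# The /v2/scrape path and {"url": ...} payload are illustrative guesses,
# not Decodo's documented API surface.
DECODO_URL = "https://scraper-api.decodo.com/v2/scrape"

def build_request(token: str, target: str) -> urllib.request.Request:
    """Build an authorized POST request; the token is sent only as Basic auth."""
    payload = json.dumps({"url": target}).encode()
    return urllib.request.Request(
        DECODO_URL,
        data=payload,
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Note that the token appears only in the Authorization header of requests to the Decodo host, which matches the proportionate-use finding above.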
Persistence & Privilege
The skill does not request always:true, does not modify other skills or system configs, and does not attempt privileged or persistent system changes. It uses the agent process to make outbound HTTPS calls to scraper-api.decodo.com.
Assessment
This skill appears to do what it says: it calls Decodo's scraping API and returns results. Before installing, confirm the following:
1) The DECODO_AUTH_TOKEN is required (SKILL.md and tools/scrape.py use it) even though the registry metadata omits it; only provide a token from Decodo's dashboard and store it securely (e.g., a local .env file or a secret manager).
2) The tool makes outbound requests to https://scraper-api.decodo.com; ensure your environment and network policy allow that and that you trust Decodo.
3) Scraping may have legal/ToS implications for the sites you target (Amazon, Google, Reddit, YouTube); ensure you have the right to scrape and use the scraped data.
4) If you need stronger assurance, verify the repo origin (the README points to a GitHub repo) and check that the hosted homepage and dashboard domain match your expectations before providing credentials.
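A quick way to store the token locally, as point 1 suggests (the .env/.gitignore pair is the conventional pattern; the token value is a placeholder):

```shell
# Write the token to a local .env file and keep it out of version control.
echo 'DECODO_AUTH_TOKEN=<token-from-decodo-dashboard>' >> .env
echo '.env' >> .gitignore
```

A secret manager is preferable on shared machines; the .env approach is the minimum the skill itself supports.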

Like a lobster shell, security has layers — review code before you run it.


