Skill flagged — suspicious patterns detected
ClawHub Security flagged this skill as suspicious. Review the scan results before using.
Robots.txt Generator
v1.0.0 · Generate, validate, and analyze robots.txt files for websites. Use when creating robots.txt from scratch, validating existing robots.txt syntax, checking if...
⭐ 0 · 157 · 0 current · 0 all-time
by John Wang (@johnnywang2001)
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw
Benign · high confidence
Purpose & Capability
Name/description, SKILL.md examples, and the included Python script all implement robots.txt generation, validation, and testing. There are no environment variables, unrelated binaries, or config paths requested that would be out of scope for this functionality.
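The generation capability described above can be sketched as a small helper. This is an illustrative stand-in, not the skill's actual code: the function name `generate_robots`, its rules-dict input format, and the example sitemap URL are all assumptions.

```python
def generate_robots(rules, sitemap=None):
    """Build robots.txt text from {user_agent: [disallowed_paths]} rules.

    Hypothetical sketch of a 'generate' flow; the real script's interface
    may differ.
    """
    lines = []
    for agent, disallows in rules.items():
        lines.append(f"User-agent: {agent}")
        lines.extend(f"Disallow: {path}" for path in disallows)
        lines.append("")  # blank line between groups
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines).rstrip() + "\n"

print(generate_robots({"*": ["/admin/", "/tmp/"]},
                      sitemap="https://example.com/sitemap.xml"))
```

A grouped, blank-line-separated layout like this is the conventional robots.txt shape, which is presumably what any generator would emit.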
Instruction Scope
Runtime instructions tell the agent to run the bundled Python script with file or URL inputs. The script reads local files and can fetch remote robots.txt via HTTP(S) (used by the validate --url flow), both of which are expected for this tool. It does not instruct the agent to read unrelated system files, credentials, or transmit data to unexpected endpoints.
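A minimal stand-alone equivalent of the validate flow can be built on Python's stdlib `urllib.robotparser`. The function name `check_robots` and the sample rules below are assumptions for illustration; they are not taken from the skill's script.

```python
from urllib import robotparser

def check_robots(content, user_agent, path):
    """Parse robots.txt text and report whether user_agent may fetch path."""
    rp = robotparser.RobotFileParser()
    rp.parse(content.splitlines())
    return rp.can_fetch(user_agent, path)

rules = """\
User-agent: *
Disallow: /private/
"""

print(check_robots(rules, "MyCrawler", "/public/page"))   # True: no rule blocks it
print(check_robots(rules, "MyCrawler", "/private/data"))  # False: matches Disallow
```

Parsing from a string like this avoids any network fetch, which is a reasonable way to exercise such a tool before pointing it at live URLs.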
Install Mechanism
This is instruction-only / script-only with no install spec. No downloads or package installs are requested, so nothing is written to disk by an installer beyond running the included script; install risk is low.
Credentials
The skill requires no environment variables, credentials, or config paths. The script performs network fetches of user-specified URLs and reads user-specified local files, which is proportional to its stated purpose.
Persistence & Privilege
The skill does not request persistent or elevated privileges (its manifest sets `always: false`). It does not modify other skills or global agent configuration based on the provided materials.
Assessment
This skill appears to do only what it says: generate, validate, and test robots.txt files. Things to keep in mind before installing or running it:
- The tool may fetch remote robots.txt files over HTTP(S) when you use validate --url; only point it at sites you trust or expect to inspect.
- It will read any local file path you pass to --file, so avoid pointing it at unrelated sensitive files (it expects robots.txt content but will read whatever path you give it).
- There are no credentials requested and no installer downloads, which reduces risk. If you plan to run the script on sensitive systems, review the full script in your environment (the provided excerpt looked consistent, but always verify the complete file shipped with the skill).
- If you want extra caution, run the script in a sandboxed environment or container first.
Like a lobster shell, security has layers: review code before you run it.
Tags: crawler · latest · robots · seo · web
