Skill flagged — suspicious patterns detected
ClawHub Security flagged this skill as suspicious. Review the scan results before using.
nas-master
v1.0.0
A hardware-aware, hybrid (SMB + SSH) suite for ASUSTOR NAS metadata scraping. Functions as a versatile Coder, Project Manager, and System Architect while maintaining strict read-only safety and i3 10th-gen resource throttling.
by @afajohn
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw · Suspicious (medium confidence)

Purpose & Capability
The SKILL.md and nas_engine.py are consistent with an SMB+SSH NAS scraper: they use SSH (paramiko), walk a network path, and write metadata into MySQL. However the registry metadata claims no required env vars/binaries while SKILL.md lists many required binaries and env vars — an important mismatch. The skill also declares support for PHP/XAMPP web dashboard generation and Windows-specific throttle constants, which are coherent with the intended Windows-targeted workflow but expand the surface area beyond a simple scraper.
Instruction Scope
Instructions explicitly direct the agent to: recursively scan NAS volumes (including hidden system folders), run SSH commands (cat /proc/mdstat, btrfs scrub status), parse internal app SQLite DBs, and generate a PHP/AJAX dashboard under C:\xampp\htdocs\nas_explorer\. These actions are within the stated purpose but involve system-level reads and writing files into a webserver directory — which increases risk and should be expected and authorized by the user.
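If a user does authorize these system-level reads, one mitigation is to gate the agent's SSH layer behind an explicit allowlist of read-only commands, so anything outside the declared diagnostics is rejected before it reaches the NAS. A minimal sketch, assuming the two commands named above; the allowlist contents and the `is_read_only` helper are illustrative, not part of the skill:

```python
import shlex

# Read-only diagnostics named in the skill's instructions; anything
# else is rejected. Extend this allowlist only with commands you have
# verified cannot modify the NAS.
READ_ONLY_COMMANDS = {
    ("cat", "/proc/mdstat"),
    ("btrfs", "scrub", "status"),
}

def is_read_only(command: str) -> bool:
    """Return True only if the command exactly matches an allowlisted
    read-only invocation (plus one trailing mountpoint for scrub status)."""
    parts = tuple(shlex.split(command))
    if parts in READ_ONLY_COMMANDS:
        return True
    # Allow "btrfs scrub status <mountpoint>" with a single extra argument.
    return parts[:3] == ("btrfs", "scrub", "status") and len(parts) == 4

print(is_read_only("cat /proc/mdstat"))             # True
print(is_read_only("btrfs scrub status /volume1"))  # True
print(is_read_only("rm -rf /volume1"))              # False
```

The agent's SSH wrapper would call this check before executing anything remotely, failing closed on unrecognized commands.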
Install Mechanism
There is no install spec (instruction-only style) so nothing is automatically downloaded from external URLs. That lowers install-time risk. The package does include Python code that depends on third-party libraries (paramiko, mysql.connector, python-dotenv, psutil), but no automated installation is declared.
Credentials
SKILL.md declares many required env vars (NAS_VOLUMES, NAS_USER, NAS_PASS, NAS_SSH_HOST, NAS_SSH_USER, NAS_SSH_PASS, DB_PASS) which are appropriate for a NAS scraper. However the registry metadata lists no required env vars — a clear inconsistency. The distributed .env file in the skill package contains concrete credentials/paths (NAS_ROOT_PATH, NAS_VOLUMES, NAS_USER, NAS_PASS, NAS_SSH_HOST, etc.). Shipping credentials or filled-in connection strings inside the package is a red flag: even if placeholders, they broaden the risk surface and could be accidentally used or leaked.
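Before trusting the bundled .env, a user can audit it for sensitive-looking variables that hold concrete values rather than placeholders. A hedged stdlib sketch; the placeholder heuristics and the `suspicious_entries` helper are illustrative, and the variable names come from the skill's own SKILL.md:

```python
import re

# Values that look like placeholders rather than real secrets.
# These heuristics are illustrative, not exhaustive.
PLACEHOLDER = re.compile(r"^(|your_.*|<.*>|changeme|xxx+|\*+)$", re.IGNORECASE)
SENSITIVE = ("PASS", "SECRET", "TOKEN", "KEY")

def suspicious_entries(env_text: str) -> list[str]:
    """Return names of sensitive-looking variables with concrete values."""
    flagged = []
    for line in env_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        name, value = line.split("=", 1)
        name, value = name.strip(), value.strip().strip("\"'")
        if any(tok in name.upper() for tok in SENSITIVE) and not PLACEHOLDER.match(value):
            flagged.append(name)
    return flagged

sample = "NAS_USER=admin\nNAS_PASS=hunter2\nNAS_SSH_PASS=your_password_here\nDB_PASS=\n"
print(suspicious_entries(sample))  # ['NAS_PASS']
```

Any variable this flags should be rotated or replaced before the skill is ever run.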
Persistence & Privilege
The skill does not request always:true and does not modify other skills, which is good. However its runtime actions include creating database records and instructions to generate files under C:\xampp\htdocs\nas_explorer\ — a system/webserver location. That means the skill, when run, will write files that may be served by a webserver; users should consider whether the agent is permitted to write into that location and whether generated content is safe to host.
What to consider before installing
Before installing or running this skill, consider the following:
1) Provenance: The source is unknown and registry metadata conflicts with SKILL.md. Ask the publisher for provenance and an explanation for the metadata mismatch.
2) Credentials in package: The skill bundle includes a .env file with live-looking credentials/paths. Never run code that uses embedded credentials you didn't provision yourself; replace them with placeholders or remove the .env before use. Treat any included passwords as potentially sensitive and verify they are not legitimate production credentials.
3) System writes: The skill instructs writing a PHP dashboard into C:\xampp\htdocs\. That will place files under a webserver root. Only allow this if you explicitly want a dashboard hosted there and you trust the generated content; run in a sandbox/VM first.
4) Network & privilege: The skill needs SMB share access and SSH credentials to the NAS; ensure you supply only least-privilege accounts (read-only where possible) and review the database account (use a dedicated DB user with limited rights).
5) Dependencies & platform: The Python code uses psutil Windows constants and assumes XAMPP paths, so test in a controlled Windows environment. Also confirm the required Python packages (paramiko, mysql-connector-python, python-dotenv, psutil) are installed from trusted sources.
6) Safety checks: Validate that the scraper truly runs read-only against your shares and that any generated web files do not unintentionally expose sensitive metadata. Review and run the code in a sandbox (isolated VM or staging network) before granting it access to production NAS or network.
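For item 4, the dedicated database account can be provisioned with only the statements the scraper actually needs: write access to its own metadata database and nothing global. A sketch of the minimal MySQL grants; the database, user, and host names here are hypothetical examples, not values from the skill:

```python
# Build the minimal GRANT statements for a dedicated scraper account.
# SELECT/INSERT/UPDATE on one database only; no DROP, no GRANT OPTION,
# no access to other schemas.
def scraper_grants(db: str, user: str, host: str = "localhost") -> list[str]:
    return [
        f"CREATE USER IF NOT EXISTS '{user}'@'{host}' IDENTIFIED BY '<set-a-password>';",
        f"GRANT SELECT, INSERT, UPDATE ON `{db}`.* TO '{user}'@'{host}';",
        "FLUSH PRIVILEGES;",
    ]

for stmt in scraper_grants("nas_metadata", "nas_scraper"):
    print(stmt)
```

Running the skill under such an account bounds the damage even if the code misbehaves.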
If you cannot confirm the origin or cannot run it in an isolated environment, do not install or run the skill. If you proceed, replace or remove the bundled .env, use scoped credentials, and inspect the PHP/dashboard generation code to ensure it does not leak data publicly.
