do-it - Gungun Decision Skill

v1.0.0

An AI decision skill that helps earthlings make life choices: you just do it, and leave the judgment to Gungun.

MIT-0
Security Scan
VirusTotal: Benign
OpenClaw: Benign (medium confidence)
Purpose & Capability
The name and description (a decision helper) match the included artifacts: a web UI, an API server, data files, and crawler scripts that collect salary/city/case data. The presence of data files and scraping scripts is consistent with a data-driven decision skill.
Instruction Scope
SKILL.md and the usage docs explicitly instruct running local services (scripts/api_server.py), running crawlers (scripts/crawl_*.py, crawl_all.py), starting a local web server, and opening local files. These steps are within the product scope, but they grant the skill broad discretion to perform network requests, scrape third-party sites, and read/write local files, so runtime behaviour should be reviewed before execution.
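The "review before execution" advice above can be sketched as a quick static pass. This is a hypothetical helper, not part of the skill: it uses Python's `ast` module to list imports in a bundled script that imply network access, and the module list is an assumption about what the crawlers might use.

```python
import ast
from pathlib import Path

# Assumed set of modules that can reach the network; adjust to taste.
NETWORK_MODULES = {"requests", "urllib", "http", "socket", "aiohttp"}

def risky_imports(source: str) -> set[str]:
    """Return top-level imported module names in `source` that imply network I/O."""
    tree = ast.parse(source)
    found = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                found.add(alias.name.split(".")[0])
        elif isinstance(node, ast.ImportFrom) and node.module:
            found.add(node.module.split(".")[0])
    return found & NETWORK_MODULES

# Example usage against a bundled script:
# print(risky_imports(Path("scripts/api_server.py").read_text()))
```

A hit is not proof of misuse (a crawler legitimately needs the network); the point is to know which scripts talk to the outside before you run them.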
Install Mechanism
No install spec is declared (instruction-only), yet the repository includes multiple runnable scripts and a package.json. There are no remote download or install steps in the bundle, which reduces remote-install risk, but executing the included code will read/write local files and perform network I/O. The missing install spec is internally consistent; it simply means the risk surface lies in the included scripts rather than in a package installer.
Credentials
Registry metadata lists no required environment variables or credentials. The documentation mentions integrating with an AI API (OpenClaw API / model) if deployed, but marks it optional, so the absence of declared secrets is consistent with the provided local/offline usage. If you plan to deploy the skill or wire up external APIs, expect to supply API keys at that point.
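If you do wire up a remote API later, reading the key from the environment keeps it out of the code. The variable name below is hypothetical, since the registry declares no required secrets:

```python
import os

def load_api_key(env=os.environ):
    """Return the configured key, or None to keep the skill in local/offline mode.

    "OPENCLAW_API_KEY" is an assumed name for whatever key a deployment would use.
    """
    return env.get("OPENCLAW_API_KEY") or None

# Example: key = load_api_key(); enable the remote backend only if key is not None.
```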
Persistence & Privilege
Skill flags are at their defaults (always: false), and the package requests no special platform privileges. However, the skill includes scripts that persist data (case/city/salary JSON) and a local API server; these are normal for a self-hosted web service, but they mean the skill can create and modify files in its workspace.
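As a sketch of what workspace-scoped persistence looks like (an assumption about the skill's behaviour, not its actual code), the JSON writes can be confined with a path check so nothing lands outside the workspace:

```python
import json
from pathlib import Path

WORKSPACE = Path("workspace").resolve()  # assumed workspace root

def save_json(name: str, payload: dict) -> Path:
    """Write `payload` as JSON under WORKSPACE, refusing path traversal."""
    target = (WORKSPACE / name).resolve()
    if WORKSPACE not in target.parents and target != WORKSPACE:
        raise ValueError(f"refusing to write outside workspace: {target}")
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(json.dumps(payload, ensure_ascii=False, indent=2))
    return target

# Example: save_json("salary/2024.json", {"city": "Shanghai", "median": 12000})
```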
Scan Findings in Context
[no_injection_signals_detected] expected: The static pre-scan reported no injection signals. This is plausible given that the repository contains straightforward API, crawler, and static web files. The absence of scan findings does not substitute for manual review of the scripts that will be executed.
Assessment
This skill appears to do what it says: a data-driven decision helper with a local web UI, an API server, and web-scraping scripts. Before running or deploying:

1. Inspect scripts/api_server.py and the crawl_*.py files for unexpected network calls, hardcoded endpoints, or POST destinations.
2. Run any server or crawler in a sandboxed environment (an isolated VM or container) and as a non-privileged user.
3. Be aware that the crawlers will fetch external websites (possible legal/robots/captcha issues) and will store data under the skill workspace.
4. If you deploy or integrate a remote AI API, supply only the necessary keys and follow least-privilege practice.
5. For additional assurance, paste the relevant portions of api_server.py or the crawler scripts here and I can do a focused review for potential exfiltration or unsafe behaviour.
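The sandboxed-run advice (point 2) can be sketched by shelling out to Docker. The image name and user ID below are assumptions; the flags themselves (`--rm`, `--network`, `--user`, a read-only bind mount) are standard Docker CLI options.

```python
import os

def sandboxed_cmd(script: str, allow_network: bool = False) -> list[str]:
    """Build a `docker run` argv that mounts the skill read-only and drops privileges."""
    return [
        "docker", "run", "--rm",
        "--user", "1000:1000",                       # non-privileged user (assumed UID)
        "--network", "bridge" if allow_network else "none",
        "-v", f"{os.getcwd()}:/skill:ro",            # skill code mounted read-only
        "-w", "/skill",
        "python:3.11-slim",                          # assumed base image
        "python", script,
    ]

# Crawlers need the network; the API server can be smoke-tested without it:
# subprocess.run(sandboxed_cmd("scripts/api_server.py"), timeout=60)
# subprocess.run(sandboxed_cmd("scripts/crawl_all.py", allow_network=True))
```

Building the argv as a list (rather than a shell string) also avoids shell-quoting surprises if paths contain spaces.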

Like a lobster shell, security has layers — review code before you run it.

latest: vk9787rnaqatgyrrhp961rvsbzn83j8xs

License

MIT-0
Free to use, modify, and redistribute. No attribution required.
