Skill flagged: suspicious patterns detected
ClawHub Security flagged this skill as suspicious. Review the scan results below before using it.
AlphaPai Comment Scraper
v0.2.0. Logs in to AlphaPai and scrapes comments from the last N hours, saving the raw text, archiving it in structured form, and building a local index. It can also query the historical comment library over the last N days with exact, vector, or hybrid retrieval, generate a mobile-friendly summary, and optionally send it to Feishu.
⭐ 0 · 149 · 0 current · 0 all-time
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan (OpenClaw)
Verdict: Suspicious, medium confidence
Purpose & Capability
The name/description (scrape AlphaPai, index and summarize comments) aligns with the code: Playwright-based scraping, SQLite+FTS5 indexing, an optional local vector index, and Feishu posting. However, the package declares no required env vars or binaries, while the SKILL.md and code repeatedly reference USER_AUTH_TOKEN, cookies files, local Chrome profile storage_state, and local CLIs (openclaw, clawhub). This omission (no declared credentials or dependencies) is an inconsistency the user should be aware of.
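One way to verify the mismatch the scan describes is to search the bundle yourself for credential references the metadata never declares. A minimal sketch, assuming the bundle is unpacked to a local directory; the patterns come from the artifacts named in this scan:

```python
import re
from pathlib import Path

# Patterns this scan report says appear in the code but not in the metadata.
SENSITIVE_PATTERNS = [
    r"USER_AUTH_TOKEN",
    r"cookies\.json",
    r"storage_state",
    r"os\.environ",
]

def find_undeclared_references(bundle_dir: str) -> dict[str, list[str]]:
    """Map each sensitive pattern to the bundle files that mention it."""
    hits: dict[str, list[str]] = {p: [] for p in SENSITIVE_PATTERNS}
    for path in Path(bundle_dir).rglob("*"):
        if path.suffix not in {".py", ".md", ".sh", ".json"} or not path.is_file():
            continue
        text = path.read_text(errors="ignore")
        for pattern in SENSITIVE_PATTERNS:
            if re.search(pattern, text):
                hits[pattern].append(str(path))
    # Keep only patterns that actually occurred somewhere.
    return {p: files for p, files in hits.items() if files}
```

Any non-empty result should correspond to a credential the publisher documents explicitly; anything undocumented is worth asking about before you run the skill.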
Instruction Scope
Runtime instructions explicitly direct the agent/user to read/save tokens, cookies, storage_state, and optionally reuse the local Chrome Profile; these are sensitive but consistent with a site-login scraper. The skill also offers a bootstrap routine that opens a real browser and saves storage state/cookies. There is no instruction to exfiltrate secrets, but the feature to send summaries to an external Feishu webhook (if configured) means collected content could be transmitted externally — the webhook is optional and disabled by default.
Install Mechanism
The registry lists no install spec, yet the bundle includes many runnable Python scripts that import Playwright, chromadb, sentence_transformers/torch and call local CLIs (openclaw, clawhub). There is no packaged dependency list or guidance in SKILL.md about installing these third-party libraries or the CLIs. That mismatch (runnable code without declared install steps) increases the risk of runtime surprises and hidden dependency installation.
Credentials
Although the skill doesn't declare required env vars in its metadata, the code and SKILL.md expect or encourage providing sensitive auth material: USER_AUTH_TOKEN (env var or token file), cookies.json, account username/password, storage_state, and even access to a local Chrome profile. These are proportionate to a login-based scraper but sensitive nonetheless, and the skill also supports configuring a Feishu webhook that would send summaries off-machine. The missing required-env metadata and the absence of explicit warnings there are a red flag.
Persistence & Privilege
The skill does not request always:true and does not modify other skills. It writes archives, indexes, and runtime metadata to a local directory (~/.openclaw/data/alphapai-scraper by default) and can produce a sanitized dist for publishing. Autonomous invocation is enabled in the agent interface metadata (allow_implicit_invocation), which is normal; weigh it against the credential access above if you are concerned about autonomous scraping of protected accounts.
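To see exactly what the skill has persisted locally, you can walk the default data directory before deciding whether to relocate or delete it. A small sketch, using the default path named above:

```python
from pathlib import Path

def audit_data_dir(base: str = "~/.openclaw/data/alphapai-scraper") -> list[tuple[str, int]]:
    """List every file under the skill's data directory with its size in bytes."""
    root = Path(base).expanduser()
    if not root.exists():
        return []
    return [(str(p.relative_to(root)), p.stat().st_size)
            for p in sorted(root.rglob("*")) if p.is_file()]
```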
What to consider before installing:
- This bundle contains runnable Python scripts (Playwright scraping, SQLite + optional Chroma/transformer vector steps) but the registry entry lists no install steps or dependencies. Expect to need to install Playwright, chromadb/Chroma client, sentence-transformers (and possibly torch), and have the 'openclaw' / 'clawhub' CLIs available. Ask the publisher for an explicit requirements/install list or run it in an isolated VM/venv.
- The skill asks for/uses sensitive auth artifacts: USER_AUTH_TOKEN (env or token file), cookies, username/password, storage_state, or direct access to your Chrome profile. These are necessary for automated login, but only provide them if you trust the code and are comfortable with those credentials being used locally. Prefer using short-lived tokens or manual bootstrap rather than handing over full browser profiles.
- Feishu webhook support will send summaries to an external endpoint if enabled. Ensure webhook_url is correct and intentionally configured; keep feishu.enabled=false if you do not want external transmission.
- The package writes archives and indexes to ~/.openclaw/data/alphapai-scraper by default. Review/relocate that path if you prefer a sandboxed location.
- Because code is included, inspect setup.sh (provided) and the import/use of Playwright and model-loading logic before running. If you plan to publish or run this on shared systems, run it in an isolated environment and review the package_skill.py behavior so you do not accidentally publish secrets.
- If you want to proceed safely: request a clear requirements.txt / install instructions from the author, along with a justification for any external CLIs required, or run the skill in a disposable container/machine after auditing the scripts.
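If you do run package_skill.py, a quick pre-publish check for credential files in the dist directory can catch accidental leaks. The filenames below are assumptions based on the artifacts this scan mentions (cookies, token file, storage_state); adjust them to match the actual files the skill creates on your machine:

```python
from pathlib import Path

# Credential artifacts the skill uses at runtime; none of these belong in a
# published bundle. Filenames are illustrative guesses, not confirmed names.
SECRET_FILENAMES = {"cookies.json", "storage_state.json", "token.txt"}

def leaked_secrets(dist_dir: str) -> list[str]:
    """Return paths of known credential files found inside the dist directory."""
    return [str(p) for p in Path(dist_dir).rglob("*")
            if p.is_file() and p.name in SECRET_FILENAMES]
```

An empty result is not proof the dist is clean, but a non-empty one is a hard stop before publishing.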
Latest version: vk97098aqgb2p7hps8dby5ad96x833crk
