news-hot-hub

v1.0.2

News hot-topic data aggregator: combines trending-search data from three platforms, Zhihu, Toutiao (今日头条), and AIBase. Supports fetching any single platform's hot list on its own, or fetching all platforms at once and outputting a combined summary. Triggered when the user mentions phrases such as "hot-search aggregation", "all-platform trending topics", "hot topics on each platform", "hot-list integration", "hot-topic data collection", "show me trending topics across the web", "check each platform's hot list", "one-click hot lists", "Zhihu/Toutiao hot lists", "whole-web...

Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description match a multi-platform hot-search aggregator, and the package contains hub.py plus per-platform scripts (zhihu.py, toutiao.py, aibase.py) that implement that functionality. The declared requirements (requests, beautifulsoup4, lxml) are consistent with web-scraping tasks.
Instruction Scope
SKILL.md instructs the agent to run scripts/hub.py and, optionally, to pip install the local requirements. hub.py dispatches to the bundled platform scripts, which perform HTTP GETs against the target sites and output JSON. Nothing in the instructions reads unrelated system files or transmits data to unknown endpoints. One minor mismatch: platform-guide.md documents an optional ZHIHU_COOKIES environment variable required for some advanced Zhihu commands (hot-question/hot-video/topic), but the skill's top-level environment requirements do not list it. Since this is an optional, advanced credential, it is not required for the documented default hot-search behavior.
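The dispatch pattern described above can be sketched as follows. This is a hypothetical illustration, not the bundled hub.py: the script names match the bundle, but the argument handling and output parsing are assumptions.

```python
# Hypothetical sketch of a hub-style dispatcher. Script names match the
# bundle (zhihu.py, toutiao.py, aibase.py); everything else is assumed.
import json
import subprocess
import sys
from pathlib import Path

SCRIPTS = {"zhihu": "zhihu.py", "toutiao": "toutiao.py", "aibase": "aibase.py"}

def fetch(platform: str) -> dict:
    """Run one bundled platform script and parse its JSON stdout."""
    script = Path("scripts") / SCRIPTS[platform]
    result = subprocess.run(
        [sys.executable, str(script)],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

def fetch_all() -> dict:
    # Aggregate every platform into a single JSON-serializable object.
    return {name: fetch(name) for name in SCRIPTS}
```

The security-relevant point is visible in the sketch: the dispatcher runs the bundled scripts as child processes, so whatever those scripts do executes as code on the host.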
Install Mechanism
There is no remote install spec embedded in the skill bundle; SKILL.md recommends running pip install -r requirements.txt, which pulls a small list of standard, well-known Python packages. The risk is typical of any pip install (packages are downloaded over the network); no arbitrary remote binaries or obscure URLs are involved.
Credentials
The skill declares no required environment variables (only an optional HOT_HUB_LIMIT is documented). The only other env var mentioned anywhere is ZHIHU_COOKIES in platform-guide.md, used by specific Zhihu subcommands; it is optional and reasonable for endpoints that require login, but since the skill does not list it as a required credential, users should only set such cookies if they understand the privacy implications.
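For illustration, an advanced Zhihu command might attach the optional cookie like this. This is a hedged sketch of the general pattern, not the bundled code; the header construction is an assumption.

```python
# Hypothetical sketch: attach ZHIHU_COOKIES only when the user has opted
# in by setting it. The cookie string is sensitive; it is never required
# for the default hot-search behavior.
import os

def zhihu_headers() -> dict:
    """Build request headers; include a Cookie header only if set."""
    headers = {"User-Agent": "Mozilla/5.0"}
    cookies = os.environ.get("ZHIHU_COOKIES")  # optional, sensitive
    if cookies:
        headers["Cookie"] = cookies
    return headers
```

A dict like this would then be passed as the headers argument of a requests.get call against the login-gated endpoint.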
Persistence & Privilege
The skill is not always-included (always: false). It does not modify other skills or system-wide agent settings. It invokes local scripts via subprocesses (expected for a dispatcher) and does not request persistent elevated privileges.
Assessment
This skill appears to do what it claims: it runs local Python scripts to fetch hot-search data from Zhihu, Toutiao, and AIBase and outputs JSON. Before installing, consider:
(1) pip install -r requirements.txt will download packages from PyPI; run it in a virtualenv or sandbox.
(2) The scripts make outbound HTTP requests to the listed domains (zhihu.com, toutiao.com, news.aibase.cn); ensure outbound network access is acceptable.
(3) Only set ZHIHU_COOKIES if you understand and trust the code and are comfortable exposing your browser cookie string to the environment; that cookie is sensitive.
(4) hub.py executes the bundled scripts via subprocess.run, so they run as code on the host; review the scripts if you need to verify they match your policy.
(5) If you require stronger isolation, run the skill inside an isolated VM/container, or review and modify the scripts to remove any functionality you do not want.
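One practical way to act on points (3) and (5) is to launch the hub with a minimal environment, so that host variables such as credentials are not inherited by the child process. A sketch, assuming the scripts/hub.py layout from the bundle; no CLI flags are assumed:

```python
# Hypothetical sketch: run the bundled hub with a stripped-down
# environment so ZHIHU_COOKIES and other host variables are not
# inherited unless deliberately passed through.
import os
import subprocess
import sys

def minimal_env(limit: int = 10) -> dict:
    # Pass only what the scripts need: PATH plus the documented
    # optional HOT_HUB_LIMIT.
    return {"PATH": os.environ.get("PATH", ""), "HOT_HUB_LIMIT": str(limit)}

def run_hub_isolated(limit: int = 10) -> subprocess.CompletedProcess:
    # "scripts/hub.py" matches the bundle layout described above.
    return subprocess.run(
        [sys.executable, "scripts/hub.py"],
        env=minimal_env(limit), capture_output=True, text=True,
    )
```

This narrows the blast radius of the subprocess calls but is not a substitute for a VM or container if you need real isolation.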

Like a lobster shell, security has layers: review code before you run it.

latest: vk976bjjhyr3t8m1psz0qxh6ryx83sf12

License

MIT-0
Free to use, modify, and redistribute. No attribution required.
