HotTrender Basic Crawler

v2.0.0

Use when users need a lightweight HotTrender crawler for four-region daily hotspot trends or custom keyword/vertical hotspot discovery. Prefer the bundled basic crawler runtime first.

by BingEdward@7487

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for 7487/hottrender-dingtalk-bot.

Prompt Preview: Install & Setup
Install the skill "HotTrender Basic Crawler" (7487/hottrender-dingtalk-bot) from ClawHub.
Skill page: https://clawhub.ai/7487/hottrender-dingtalk-bot
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install hottrender-dingtalk-bot

ClawHub CLI


npx clawhub@latest install hottrender-dingtalk-bot
Security Scan
VirusTotal
Benign
OpenClaw
Benign (high confidence)
Purpose & Capability
The name/slug contains 'dingtalk-bot' but both SKILL.md and the runtime explicitly state that DingTalk push/publishing is excluded. This is a confusing naming artifact but not a functional mismatch—otherwise the bundled scripts and providers align with the described crawling purpose.
Instruction Scope
SKILL.md limits actions to resolving/using the bundled runtime, running the provided scripts, running tests, and making narrow code edits under src/providers or src/crawler. It does suggest searching the workspace/home for an existing checkout (find up to maxdepth 5) and installing the bundled runtime if missing; both are consistent with the stated workflow and not excessive.
Install Mechanism
There is no external download/install spec. The skill is instruction-only at install-time and includes an installer script that copies the bundled runtime from the skill package to a target directory; no remote URLs, shorteners, or extracted archives from unknown hosts are used.
Credentials
The skill declares no required environment variables. The runtime optionally reads TIKTOK_MS_TOKEN and provider-specific settings (proxies, cookies) for the TikTok provider; these are documented as optional and only needed if the user enables TikTok. Users should not supply sensitive tokens unless they intend to use TikTok scraping and have vetted the environment.
Persistence & Privilege
always:false (no forced presence). agents/openai.yaml sets allow_implicit_invocation: true (normal for skills), so the agent may call the skill autonomously under platform policies — expected behavior, not a red flag by itself.
Assessment
This skill appears to be what it claims: a small, local crawler runtime packaged with runnable scripts. Before installing or running it: 1) Run the offline tests first (configs/providers.yaml -> mode: offline and python -m pytest tests/test_basic_crawler.py) to verify behavior without network access. 2) Be cautious about enabling TikTok: it requires Playwright, network access, and optionally TIKTOK_MS_TOKEN or cookies; only provide such tokens if you understand the scraping implications. 3) Inspect configs/providers.yaml for any proxy or cookie values before populating them with secrets—do not paste production secrets into provider config files you haven't reviewed. 4) Note the naming mismatch (slug/name contains dingtalk) — the runtime deliberately excludes DingTalk push/publish features. 5) Run the runtime in a restricted virtualenv or workspace (not a privileged/system environment) and avoid exposing other credentials. 6) If you plan to modify the TikTok provider or enable Playwright, review the provider code carefully (it monkey-patches Playwright internals and may require debugging).

Like a lobster shell, security has layers — review code before you run it.

Tags: basic-crawler, daily-trends, hottrender, keyword-hotspots, latest, vertical-hotspots
91 downloads
0 stars
4 versions
Updated 1w ago
v2.0.0
MIT-0

HotTrender Basic Crawler

Core Rule

For four-region daily hotspot trends or vertical/custom-keyword hotspot discovery, use the bundled crawler runtime first. Do not reimplement platform crawling until the existing providers and scripts are checked.

Before changing code, answer these questions:

  1. Is there already a script, API, provider, doc, or test covering this need?
  2. Can the user goal be satisfied by running or configuring that capability?
  3. If not, what exact gap remains, and where is the smallest extension point?

Only edit code after that evaluation.

Repository Layout

This skill bundles a sanitized basic crawler runtime under assets/hottrender-runtime/. It does not bundle DingTalk, OSS, ActionCard, lp-ads, worker queues, databases, logs, LLM, or any secrets.
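
Based on the paths referenced throughout this skill, the bundled runtime roughly looks like the following; this tree is assembled from the file names mentioned in this document and should be verified against the actual package:

```
assets/
  install_hottrender_runtime.py
  hottrender-runtime/
    scripts/
      fetch_daily_trends.py
      fetch_keyword_hotspots.py
    configs/
      providers.yaml
    src/
      crawler.py
      providers/
    tests/
      test_basic_crawler.py
```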

First resolve the runtime path from environment variables, the current workspace, or the bundled runtime:

HOTTRENDER_APP_DIR   # directory containing scripts/fetch_daily_trends.py

If HOTTRENDER_APP_DIR is missing, install the bundled runtime:

python assets/install_hottrender_runtime.py --target ./HotTrenderRuntime
export HOTTRENDER_APP_DIR="$PWD/HotTrenderRuntime"

If variables are missing but a local checkout may exist, discover it safely:

find "$PWD" "$HOME" -maxdepth 5 -path '*/scripts/fetch_daily_trends.py' 2>/dev/null
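
The resolution order above (environment variable first, then a bounded workspace search, then the bundled installer) can be sketched as a small helper. `resolve_runtime_dir` is illustrative, not part of the bundled runtime:

```python
import os
from pathlib import Path

def resolve_runtime_dir(search_roots=None, max_depth=5):
    """Locate a HotTrender runtime checkout (illustrative helper).

    Resolution order mirrors the skill instructions: the
    HOTTRENDER_APP_DIR environment variable first, then a bounded
    search of the given roots for scripts/fetch_daily_trends.py.
    Returns None when nothing is found; the caller should then run
    assets/install_hottrender_runtime.py.
    """
    env_dir = os.environ.get("HOTTRENDER_APP_DIR")
    if env_dir and (Path(env_dir) / "scripts" / "fetch_daily_trends.py").is_file():
        return Path(env_dir)
    for root in search_roots or (Path.cwd(), Path.home()):
        root = Path(root)
        for marker in root.rglob("fetch_daily_trends.py"):
            rel = marker.relative_to(root)
            # Mirror `find ... -maxdepth 5 -path '*/scripts/fetch_daily_trends.py'`
            if len(rel.parts) <= max_depth and marker.parent.name == "scripts":
                return marker.parent.parent
    return None
```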

Fast Path

Use these references only when needed:

Operating Workflow

  1. For "四地区热点" (four-region hotspots), "每日热点" (daily hotspots), "daily trends", or "jp/us/tw/kr", start from scripts/fetch_daily_trends.py.
  2. For "垂类热点" (vertical hotspots), "关键词热点" (keyword hotspots), "自定义关键词" (custom keywords), or "custom keyword", start from scripts/fetch_keyword_hotspots.py.
  3. For "抓取是否有效" (is crawling working), "平台数据不对" (platform data looks wrong), or "为什么没结果" (why are there no results), inspect configs/providers.yaml, run offline mode first, then real mode.
  4. For code changes, keep the runtime basic. Do not add DingTalk, OSS, lp-ads, database, worker, or LLM features back into this skill.
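
The routing in steps 1-2 can be sketched as a small dispatch function; the keyword lists below mirror the workflow above and are illustrative, not exhaustive:

```python
# Illustrative routing from a user request to a bundled entry point.
DAILY_KEYWORDS = ("daily trends", "四地区热点", "每日热点", "jp/us/tw/kr")
KEYWORD_KEYWORDS = ("custom keyword", "垂类热点", "关键词热点", "自定义关键词")

def pick_entrypoint(request: str) -> str:
    """Map a request to a script, or fall back to config diagnosis."""
    text = request.lower()
    if any(k in text for k in DAILY_KEYWORDS):
        return "scripts/fetch_daily_trends.py"
    if any(k in text for k in KEYWORD_KEYWORDS):
        return "scripts/fetch_keyword_hotspots.py"
    # Anything else starts with configuration inspection, not code changes.
    return "configs/providers.yaml"
```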

Guardrails

  • Never print API keys, cookies, tokens, msToken, proxy credentials, or other secrets.
  • Do not fabricate live platform data. Offline/sample mode must be called out as offline/sample.
  • Do not introduce push, publishing, database, queue, or workspace features into this basic crawler.
  • If the user has no HotTrender checkout, use the bundled runtime installer before proposing code rewrites.
  • Keep changes scoped: provider logic in src/providers, orchestration in src/crawler.py, CLI entrypoints in scripts/.
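
The first guardrail (never print secrets) can be enforced by masking provider configs before anything is logged. This helper is a sketch; the list of secret-looking key names is an assumption, not taken from the runtime:

```python
# Key substrings commonly marking secrets in provider configs (illustrative).
SENSITIVE_KEYS = ("token", "cookie", "api_key", "password", "proxy", "secret")

def redact(config: dict) -> dict:
    """Return a copy of a config dict with secret-looking values masked."""
    masked = {}
    for key, value in config.items():
        if isinstance(value, dict):
            masked[key] = redact(value)
        elif any(s in key.lower() for s in SENSITIVE_KEYS):
            masked[key] = "***"
        else:
            masked[key] = value
    return masked
```

Running any debug output through such a filter keeps msToken, cookies, and proxy credentials out of logs even when a full config dump is requested.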

Verification

Prefer focused verification:

cd "$HOTTRENDER_APP_DIR"
python -m pytest tests/test_basic_crawler.py -q
python scripts/fetch_daily_trends.py --config configs/providers.yaml --output out/daily_trends.md
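
Before the commands above touch the network, it can help to confirm the config still declares offline mode. This check is a rough text scan rather than a full YAML parse, and the top-level `mode:` key is assumed from the assessment notes:

```python
from pathlib import Path

def is_offline(config_path: str) -> bool:
    """Rough check that a providers config declares `mode: offline`.

    Scans lines as text (ignoring comments) instead of parsing YAML,
    so it only recognizes a simple top-level-style `mode:` entry.
    """
    for line in Path(config_path).read_text().splitlines():
        stripped = line.split("#", 1)[0].strip()
        if stripped.startswith("mode:"):
            return stripped.split(":", 1)[1].strip() == "offline"
    return False
```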
