Wechat Search

Advisory

Audited by static analysis on Apr 30, 2026.

Overview

No suspicious patterns detected.

Findings (0)

Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.

What this means

Search behavior and credential handling may depend on another local Tavily skill that is outside this package's reviewed files.

Why it was flagged

The skill delegates search to a local script that lives under another skill's path and is not included in this package. That is consistent with its Tavily-backed search purpose, but users should confirm that the helper skill and its script are installed from a trusted source.

Skill content
```
'node', '/root/.openclaw/workspace/skills/tavily-search/scripts/search.mjs'
```

Recommendation

Declare the Tavily helper dependency clearly, avoid hard-coded workspace paths where possible, and use only a trusted local Tavily integration.
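As a minimal sketch of that recommendation, the skill could verify the helper script exists before spawning it, so a missing or untrusted install fails loudly instead of silently. The path and function name here are hypothetical, based only on the snippet the scan surfaced.

```python
import os
import subprocess

# Hypothetical default, taken from the scanned snippet; adjust to your install.
DEFAULT_HELPER = os.path.expanduser(
    "~/.openclaw/workspace/skills/tavily-search/scripts/search.mjs"
)

def run_search(query: str, helper: str = DEFAULT_HELPER) -> str:
    """Invoke the local Tavily helper via node, failing loudly if it is missing."""
    if not os.path.isfile(helper):
        raise FileNotFoundError(
            f"Tavily helper not found at {helper}; install the tavily-search "
            "skill from a trusted source first."
        )
    result = subprocess.run(
        ["node", helper, query],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout
```

Surfacing the dependency as an explicit error at call time is what lets users notice a missing or substituted helper rather than getting an opaque failure.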

What this means

Your Tavily API key may be used to perform searches and may be available to the local Tavily helper process.

Why it was flagged

The code can read a local Tavily API key configuration and pass that key to the Tavily search process. This is expected for a Tavily-backed search skill, but it is a credential use that users should be aware of.

Skill content
```
tavily_config_path = os.path.expanduser('~/.openclaw/tavily-config.json') ... env['TAVILY_API_KEY'] = config.get('api_key', '')
```

Recommendation

Use a Tavily key scoped for search, keep it separate from unrelated credentials, and make the credential/config path explicit in the skill metadata.

What this means

A user could overestimate how much the skill itself enforces scraping safeguards.

Why it was flagged

The documentation makes strong compliance assurances, but the visible implementation mostly delegates fetching to external tools and does not itself clearly enforce every stated safeguard. Users should treat the claims as expectations to verify rather than guaranteed protections.

Skill content
- **Respects robots.txt**: Checks and follows robots.txt directives
- **Rate limiting**: Minimum 5-second delay between requests

Recommendation

Document which layer enforces robots.txt and rate limiting, and ensure the code or tool configuration actually applies the advertised safeguards.