Skill v4.1.0
ClawScan security
Smart Search · ClawHub's context-aware review of the artifact, metadata, and declared behavior.
Scanner verdict
Suspicious · Apr 1, 2026, 2:29 AM
- Verdict: suspicious
- Confidence: medium
- Model: gpt-5-mini
- Summary: The skill appears to implement the described multi-engine search (Exa MCP, SearX, Tavily) and the bundled scripts are plausible, but there are internal metadata inconsistencies and non-obvious privacy/network implications that you should verify before installing or running any code.
- Guidance
- What to check before installing/running Smart Search:
  - Metadata mismatch: SKILL.md mentions SEARXNG_URL and TAVILY_API_KEY while the registry metadata says no env vars; confirm whether you must set any env vars for your use case (Tavily is optional; Exa MCP requires no key).
  - Review ~/.openclaw/.env before running: search.sh sources that file and, as a fallback, exports every key=value pair it contains. Put only the variables you intend to use (TAVILY_API_KEY, SEARXNG_URL) there; do not store unrelated secrets (AWS keys, SSH keys) in that .env.
  - Network/privacy: the skill sends your query text to external services (https://mcp.exa.ai and, optionally, https://api.tavily.com). Do not use it to search sensitive personal, corporate, or secret information unless you have a trusted local SearX instance and force queries there.
  - Inspect the scripts yourself (search.sh, deploy-searx.sh, publish.sh). They are short and readable; if you are uncomfortable running them, try them in an isolated environment or container first.
  - If you need strong privacy guarantees, deploy the provided local SearX and set SEARXNG_URL to the localhost instance before using the skill.
- If you want, I can:
  - Point out the exact lines in search.sh that read .env and call remote endpoints
  - Summarize the network calls that will occur for a given query
  - Suggest a safe sequence of commands to run the skill in a sandboxed/containerized environment
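The "force queries to a local SearX instance" advice can be turned into a simple guard. This is a sketch: the helper name and the URL patterns it accepts are assumptions, not part of the skill's bundled scripts.

```shell
# Refuse to proceed unless SEARXNG_URL points at a local SearX instance.
# (Hypothetical helper; not from search.sh.)
require_local_searx() {
  case "${SEARXNG_URL:-}" in
    http://localhost*|http://127.0.0.1*)
      return 0 ;;   # local instance configured: safe to query
    *)
      echo "refusing: SEARXNG_URL does not point at a local SearX" >&2
      return 1 ;;
  esac
}

# Example: only run the bundled script when the guard passes.
# require_local_searx && ./search.sh "my private query"
```

A guard like this fails closed: if SEARXNG_URL is unset or points at a remote host, nothing is sent.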
Review Dimensions
- Purpose & Capability
- Note: The code and scripts match the stated purpose (routing queries to Exa MCP, local SearX, and Tavily). Required binaries (curl, python3) and optional Docker for SearX are reasonable for a search/aggregation skill. However, the documentation and metadata are inconsistent: the top-level registry metadata lists no required env vars, SKILL.md's metadata lists SEARXNG_URL and TAVILY_API_KEY as required, and _meta.json marks them optional. The "zero-config / no API key required" claim is true for Exa MCP, but the Tavily integration still needs a key if you want AI summaries; the messaging around "zero-config" versus optional features is confusing and should be clarified.
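One way to cut through the required-vs-optional confusion is to check which of the two variables are actually set before running anything. A sketch, with the helper name being mine; the variable names come from SKILL.md.

```shell
# Report which optional variables are set in the current environment.
# (Hypothetical helper; bash-specific due to ${!var} indirect expansion.)
check_search_env() {
  local var
  for var in TAVILY_API_KEY SEARXNG_URL; do
    if [ -n "${!var:-}" ]; then
      echo "$var is set (optional feature enabled)"
    else
      echo "$var is unset (Exa MCP path still works with zero config)"
    fi
  done
}
```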
- Instruction Scope
- Note: Runtime instructions and scripts (search.sh, deploy-searx.sh) are narrowly scoped to building and using search results: they read ~/.openclaw/.env, call remote search endpoints, optionally deploy a local SearX instance via Docker, and print results to the terminal. There is no evidence the scripts read unrelated system paths (e.g., ~/.ssh, ~/.aws). Caveat: search.sh contains a fallback that exports all key=value pairs from ~/.openclaw/.env (export $(cat ~/.openclaw/.env | ... | xargs)), which indiscriminately exports every variable present in that file; review the file before running. Also note that networked queries are sent to third-party services (https://mcp.exa.ai and https://api.tavily.com), so search terms (which may include sensitive information) will be transmitted unless you force SearX/local mode.
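A safer alternative to the indiscriminate fallback export is to pull only the two documented variables out of the file. This is a sketch, not the skill's code; the function name and default path handling are assumptions.

```shell
# Export only TAVILY_API_KEY and SEARXNG_URL from the .env file,
# ignoring any unrelated entries stored there. (Hypothetical helper.)
load_search_env() {
  local env_file="${1:-$HOME/.openclaw/.env}" key line
  [ -f "$env_file" ] || return 0
  for key in TAVILY_API_KEY SEARXNG_URL; do
    # take the last matching KEY=VALUE line; comments never match
    line=$(grep -E "^${key}=" "$env_file" | tail -n1)
    if [ -n "$line" ]; then
      export "$line"
    fi
  done
  return 0
}
```

Unlike the `export $(cat ... | xargs)` fallback, nothing else in the file (AWS keys, tokens) ever reaches the script environment.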
- Install Mechanism
- OK: No install spec is provided (instruction-only install), which minimizes automatic disk writes. Code files are bundled and intended to be executed by the user or agent; deploy-searx.sh uses a public Docker image (searx/searx) and writes a local settings.yml in the skill directory. The scripts themselves use no downloads from obscure hosts or URL shorteners, though the documentation contains an optional curl example pointing at a public GitHub raw URL. Overall, the install mechanism is conventional, but executing bundled scripts still carries the normal code-execution risk.
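For reference, the deployment described above amounts to something like the following. This is a deployment-config sketch of what deploy-searx.sh is described to do, not its actual contents; the container name, port, and mount path are assumptions, so read the real script before running it.

```shell
# Run the public searx/searx image with a settings file kept in the
# skill directory (paths and port are assumptions, not from the script).
docker run -d --name local-searx \
  -p 8080:8080 \
  -v "$PWD/settings.yml:/etc/searx/settings.yml:ro" \
  searx/searx

# Then point the skill at the local instance:
export SEARXNG_URL="http://localhost:8080"
```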
- Credentials
- Concern: Requested credentials are limited to an optional TAVILY_API_KEY (for Tavily summaries) and an optional SEARXNG_URL (for a local SearX), which is appropriate for the stated integrations. However, metadata inconsistencies (some files claim no required env vars, SKILL.md lists them as required, and _meta.json marks them optional) create confusion. Because the skill sources ~/.openclaw/.env (and falls back to exporting every KEY=VALUE line), any secrets you place in that file will be available to the script environment. The author asserts keys are not uploaded, and Tavily calls use the key only in the Authorization header, but you should ensure you store only the intended variables in that .env file.
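The "store only the intended variables" advice can be checked mechanically. A sketch with an assumed function name; the two allowed variable names come from the skill's documentation.

```shell
# Warn about any variables in the .env file beyond the two the skill
# documents, since the fallback export would expose them all.
# (Hypothetical helper; not part of the skill.)
audit_openclaw_env() {
  local env_file="${1:-$HOME/.openclaw/.env}"
  [ -f "$env_file" ] || return 0
  grep -E '^[A-Za-z_][A-Za-z0-9_]*=' "$env_file" \
    | cut -d= -f1 \
    | grep -vE '^(TAVILY_API_KEY|SEARXNG_URL)$' \
    | sed 's/^/warning: unrelated variable in .env: /'
}
```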
- Persistence & Privilege
- OK: The skill does not request "always" privilege and does not modify other skills or system-wide configuration. deploy-searx.sh writes config under the skill directory and runs a Docker container (normal behavior for an optional local service). Nothing indicates the skill tries to gain persistent elevated privileges or to change other skills' settings.
