Cross-Validated Search

v16.0.0

OpenClaw skill for source-backed web search, page reading, and evidence-aware claim checking. Use it to verify factual answers with live search results and e...

by Da Wei (@wd041216-bit)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt below, then paste it into OpenClaw to install wd041216-bit/cross-validated-search.

Prompt preview: Install & Setup
Install the skill "Cross-Validated Search" (wd041216-bit/cross-validated-search) from ClawHub.
Skill page: https://clawhub.ai/wd041216-bit/cross-validated-search
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line


Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install cross-validated-search

ClawHub CLI


npx clawhub@latest install cross-validated-search
Security Scan
VirusTotal
Benign
OpenClaw
Benign
medium confidence
Purpose & Capability
The name and description match the runtime instructions (CLI commands for search-web, browse-page, verify-claim, evidence-report). However, the SKILL.md recommends 'pip install cross-validated-search' and refers to an optional env var CROSS_VALIDATED_SEARCH_SEARXNG_URL even though the registry metadata lists no required env vars; this mismatch is minor but worth noting.
Instruction Scope
The SKILL.md confines the agent to search, page-reading, and claim-verification workflows; it does not instruct the agent to read unrelated files, access system configs, or exfiltrate data to unknown endpoints. Examples show CLI usage and --deep/--json flags; the only operational instruction that affects the environment is to install the package via pip.
Install Mechanism
There is no registry-level install spec, but the documentation instructs users/agents to run 'pip install cross-validated-search'. Installing a third-party PyPI package is common but has moderate risk—review the PyPI package and its source (the project homepage GitHub link is provided) before installing, or run in an isolated environment.
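One way to follow that advice is sketched below: install into a throwaway virtual environment so the package never touches the system Python. This is a minimal sketch, not part of the skill itself; the package name comes from the SKILL.md, and the environment name `cvs-env` is arbitrary.

```shell
# Create a throwaway virtual environment so the package stays isolated
# from the system Python (review the package source first).
python3 -m venv cvs-env

# After reviewing the source on the linked GitHub repo or PyPI,
# install into the isolated environment only:
# ./cvs-env/bin/pip install cross-validated-search

# Deleting the directory later removes the package completely:
# rm -rf cvs-env

./cvs-env/bin/python --version
```

Running the install inside a container gives the same isolation with stronger guarantees, at the cost of more setup.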
Credentials
The skill requires no credentials and declares no required env vars, which is proportional. However, SKILL.md references CROSS_VALIDATED_SEARCH_SEARXNG_URL as a recommended configuration for a self-hosted provider; that env var is not declared in metadata. Also be aware that search queries and fetched page contents will be sent to external search providers (ddgs or searxng), which may expose query contents and retrieved pages to those services.
Persistence & Privilege
The skill does not request 'always: true', does not require persistent system-wide changes, and is instruction-only with no code files in the bundle. There is no indication it modifies other skills or agent-wide settings.
Assessment
This skill appears to do what it says (search and evidence-aware verification) and does not ask for credentials, but take two precautions before using it: (1) inspect the cross-validated-search package source on the linked GitHub repo or PyPI before running 'pip install' (or install it inside an isolated environment/container), and (2) consider privacy: search queries and full page contents will be sent to whichever search provider is used (duckduckgo/ddgs by default, or a searxng instance if you set CROSS_VALIDATED_SEARCH_SEARXNG_URL). If you need confidentiality, self-host a searxng instance and set that URL; if you cannot verify the package source, avoid installing and instead use a verified browser/search integration.

Like a lobster shell, security has layers — review code before you run it.

latest: vk9738g4zrwhs0srvfzr232bdfs83h0wx
156 downloads
0 stars
1 version
Updated 1mo ago
v16.0.0
MIT-0

Cross-Validated Search for OpenClaw

This skill gives OpenClaw a practical verification workflow:

  • search-web for live search results
  • browse-page for reading the full content of a source
  • verify-claim for support/conflict classification
  • evidence-report for a citation-ready summary with next steps

Install

pip install cross-validated-search

Minimum verification

search-web "OpenAI API pricing" --type news --timelimit w
verify-claim "Python 3.13 is the latest stable release" --deep --max-pages 2 --json
evidence-report "Python 3.13 stable release" --claim "Python 3.13 is the latest stable release" --deep --json

Recommended flow

  1. Run search-web for factual or recent questions.
  2. Use browse-page on the most relevant source when snippets are not enough.
  3. Use verify-claim when a concrete claim needs a support/conflict summary.
  4. Use evidence-report when you want a compact evidence package with citations and next steps.
  5. Use --deep when the claim matters enough to justify page-aware verification.
  6. Cite the returned URLs in the final answer.
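The flow above can be sketched as a single shell pass, assuming the CLI commands from this README are installed and on PATH; the query and claim are placeholders, and the `command -v` guard is added here so the sketch degrades gracefully when the tool is absent.

```shell
CLAIM="Python 3.13 is the latest stable release"

if command -v verify-claim >/dev/null 2>&1; then
  # 1. Search for recent coverage of the factual question.
  search-web "Python 3.13 release" --type news --timelimit w
  # 3. Ask for a support/conflict summary of the concrete claim,
  #    page-aware via --deep since the claim matters.
  verify-claim "$CLAIM" --deep --max-pages 2 --json
  # 4. Package the evidence with citations and next steps.
  evidence-report "Python 3.13 stable release" --claim "$CLAIM" --deep --json
  status="verified"
else
  echo "cross-validated-search CLI not found; install it first"
  status="skipped"
fi
```

Step 2 (browse-page) is omitted from the sketch because it is only needed when the search snippets are not enough on their own.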

What success looks like

  • the verdict is explicit
  • the result includes support and conflict scores
  • page_aware is true when deep verification ran
  • the recommended free path is ddgs + self-hosted searxng
  • source URLs are ready to cite

Limits

  • verify-claim is heuristic and evidence-aware, not a proof engine.
  • The default provider path is ddgs.
  • The recommended free upgrade path is self-hosted searxng via CROSS_VALIDATED_SEARCH_SEARXNG_URL.
  • Conflicting sources are surfaced, not automatically reconciled.
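For the searxng upgrade path mentioned above, configuration is a single environment variable; the URL below is a placeholder for your own deployment.

```shell
# Route searches to a self-hosted SearXNG instance instead of the
# default ddgs provider (placeholder URL; substitute your deployment).
export CROSS_VALIDATED_SEARCH_SEARXNG_URL="http://localhost:8080"
```

Unset the variable to fall back to the default ddgs provider. Note that this variable is documented only in the SKILL.md, not in the registry metadata.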

License

MIT License.
