Web Search Local

v0.1.1

Local web search without API Key. Supports Bing, DuckDuckGo, Yandex multi-engine search with built-in cache and automatic failover. Use when users need to se...


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for lnblxj/web-search-local.

Prompt preview (Install & Setup):
Install the skill "Web Search Local" (lnblxj/web-search-local) from ClawHub.
Skill page: https://clawhub.ai/lnblxj/web-search-local
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install web-search-local

ClawHub CLI


npx clawhub@latest install web-search-local
Security Scan
VirusTotal: Benign · View report →
OpenClaw: Benign (high confidence)
Purpose & Capability
Name/description match the implementation: the code scrapes search engines, supports engine selection/auto-failover, caching, delays and proxy options. There are no extraneous credential or system access requests that don't belong to a web-search tool.
Instruction Scope
SKILL.md and code consistently instruct the agent to run the provided CLI script and document its flags. Instructions and code operate on network requests and a local cache directory (~/.cache/web-search-local) only; they do not attempt to read unrelated system files or external secrets.
Install Mechanism
There is no declared install spec in the registry (instruction-only), but the README and docs state runtime dependencies (requests, beautifulsoup4). That mismatch (no automated install step) is not malicious but worth noting: the agent or user must install dependencies manually.
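Because there is no automated install step, the documented runtime dependencies have to be installed by hand. A minimal setup fragment, assuming a POSIX shell and a Python 3 with the venv module available:

```shell
# Create an isolated environment so the skill's dependencies
# don't touch the system Python
python3 -m venv .venv
. .venv/bin/activate

# Install the runtime dependencies documented in the README
pip install requests beautifulsoup4
```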
Credentials
The skill requests no environment variables, no credentials, and only writes to its own cache directory in the user's home. Proxy configuration is supported via CLI/API functions; that is appropriate for a network client. No unrelated secrets (AWS keys, tokens) are required.
Persistence & Privilege
Skill does not request persistent 'always' inclusion, does not modify other skills, and only stores its own cache files under ~/.cache/web-search-local. It does not alter system-wide agent configuration or other skill settings.
Assessment
This skill appears coherent for local web-scraping search. Before installing:

  1. It performs live HTTP(S) requests to public search engines (Bing, DDG, Yandex) and may trigger CAPTCHAs or be blocked on some networks.
  2. It stores cached search results under ~/.cache/web-search-local (inspect or clear with --cache-clear if local storage matters to you).
  3. Install the documented Python dependencies (requests, beautifulsoup4) manually or in a virtualenv.
  4. Verbose/debug logs may include queries and timing info; avoid them for sensitive queries you do not want logged.
  5. For stronger isolation, run the script in a sandboxed container or VM.

For greater assurance, review the full scripts/search.py for any additional network calls or behavior not covered in the docs (the provided file looks consistent with the documented functionality).

Like a lobster shell, security has layers — review code before you run it.

latest: vk97bc79a8f0gprtt8fncwwtkzn837wqw
222 downloads · 0 stars · 2 versions
Updated 1 month ago
v0.1.1
MIT-0

web-search-local

Local web search skill without an API key; supports multi-engine auto-switching and a built-in cache.

Triggers

Use this skill when the user requests:

  • "搜索/查一下/搜一下 + 关键词" (Search/Check/Look up + keywords)
  • "帮我找/帮我搜 + 信息" (Help me find/search + information)
  • Need to get real-time web information
  • "最近/最新 + 某话题" (Recent/Latest + topic)
  • "查资料/网上查" (Look up materials/Search online)

Usage

Basic Search

python3 scripts/search.py -q "keywords"

Specify Engine

python3 scripts/search.py -q "keywords" -e bing      # Bing (default)
python3 scripts/search.py -q "keywords" -e auto      # Auto failover
python3 scripts/search.py -q "keywords" -e webfetch  # urllib fallback

Common Options

python3 scripts/search.py -q "keywords" -l 5         # Limit result count
python3 scripts/search.py -q "keywords" --fast       # Fast mode, skip cookies
python3 scripts/search.py -q "keywords" --no-cache   # Skip cache
python3 scripts/search.py -q "keywords" -f text      # Text format output
python3 scripts/search.py -q "keywords" -o file.json # Output to file
python3 scripts/search.py -q "keywords" -v           # Verbose logging

Engine Selection

  Engine     Description
  bing       Default primary; supports RSS and HTML dual mode
  auto       Auto failover: Bing → Yandex → DDG → WebFetch
  webfetch   urllib standard library; no requests package needed

The default engine is bing; switch to auto when it fails.
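The auto order above can be sketched as a first-success loop. This is an illustration only, not the skill's actual implementation; the engine callables are hypothetical stand-ins:

```python
def search_auto(query, engines):
    """Try each (name, callable) engine in order; return the first non-empty result."""
    errors = {}
    for name, engine in engines:
        try:
            results = engine(query)
            if results:  # a non-empty result set counts as success
                return name, results
        except Exception as exc:  # engine blocked, timed out, etc.
            errors[name] = exc
    raise RuntimeError(f"all engines failed: {errors}")
```

With engines listed as [("bing", ...), ("yandex", ...), ("ddg", ...), ("webfetch", ...)] this reproduces the documented fallback order.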

Cache Mechanism

  • Location: ~/.cache/web-search-local/
  • Expiration: 1 hour
  • A cache hit returns in under a second
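The location and 1-hour expiration above suggest a file-per-query cache keyed on the search. A minimal sketch under that assumption; the skill's real key scheme and file layout may differ:

```python
import hashlib
import json
import time
from pathlib import Path

CACHE_DIR = Path.home() / ".cache" / "web-search-local"
TTL_SECONDS = 3600  # documented 1-hour expiration


def _cache_path(query, engine):
    # One JSON file per (engine, query) pair, named by hash
    key = hashlib.sha256(f"{engine}:{query}".encode()).hexdigest()
    return CACHE_DIR / f"{key}.json"


def cache_get(query, engine, now=None):
    """Return cached results, or None on a miss or after expiry."""
    path = _cache_path(query, engine)
    if not path.exists():
        return None
    now = time.time() if now is None else now
    if now - path.stat().st_mtime > TTL_SECONDS:
        return None  # expired
    return json.loads(path.read_text())


def cache_put(query, engine, results):
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    _cache_path(query, engine).write_text(json.dumps(results))
```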

Cache Management

python3 scripts/search.py --cache-stats   # View cache statistics
python3 scripts/search.py --cache-clear   # Clear cache

Output Format

JSON (default)

python3 scripts/search.py -q "keywords" -f json
{
  "query": "search keywords",
  "engine": "bing",
  "count": 3,
  "results": [
    {"title": "page title", "url": "https://...", "snippet": "summary description"}
  ],
  "elapsed_seconds": 0.58
}
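A downstream consumer can parse that JSON directly with the standard library. A small sketch using the fields shown above; the raw string here is a hand-written sample, not live output:

```python
import json

raw = '''{"query": "search keywords", "engine": "bing", "count": 1,
          "results": [{"title": "page title", "url": "https://example.org",
                       "snippet": "summary description"}],
          "elapsed_seconds": 0.58}'''

data = json.loads(raw)
# Pull out just the URLs, e.g. for follow-up fetching
urls = [r["url"] for r in data["results"]]
```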

Text

python3 scripts/search.py -q "keywords" -f text
Search: python programming
Engine: bing
Results: 2
============================================================

1. Python.org - Official Site
   https://python.org
   The official home of Python

Notes

  • Chinese searches are optimized via cn.bing.com
  • Each search adds a 2-5 second delay (anti-crawler policy)
  • No delay on a cache hit
  • Content behind a login is not supported
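The 2-5 second anti-crawler delay could be implemented as a randomized sleep. A sketch, with the sleep function injectable so the chosen delay can be inspected; the skill's actual pacing logic is not shown in these docs:

```python
import random
import time


def polite_delay(low=2.0, high=5.0, sleep=time.sleep):
    """Sleep for a random 2-5 s between searches; return the delay used."""
    delay = random.uniform(low, high)
    sleep(delay)
    return delay
```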

