Private Deep Search
v1.0.0
Performs private, multi-round deep web searches excluding Google/Bing, synthesizes results with citations, and does not retain user data or logs.
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw
Benign
High confidence

Purpose & Capability
Name and artifacts match: this is a self-hosted SearXNG-based search + a Python 'deep_research' tool that iteratively queries SearXNG and scrapes pages. Required software (Docker, Python, aiohttp, BeautifulSoup) is appropriate for the stated task; there are no unrelated credentials, binaries, or config paths requested.
Instruction Scope
Runtime instructions (setup.sh, docker-compose, SKILL.md) remain within the expected scope: start a local SearXNG container, copy skills into ~/.clawdbot/skills, and run deep_research.py, which fetches and scrapes web pages returned by SearXNG. One operational note: the Python tool will fetch arbitrary result URLs (this is intended), so it will make outbound HTTP requests to the internet and could reach any host that appears in search results. This is expected for a scraper, but it is a network-privacy consideration rather than an incoherence.
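To illustrate the loop described above, here is a minimal, hedged sketch of what a deep_research.py-style tool does: query the local SearXNG JSON API, then fetch and strip each result page. The actual tool uses aiohttp and BeautifulSoup; this sketch uses only the standard library, and the localhost port, the `format=json` endpoint, and all function names here are assumptions for illustration, not the package's real code.

```python
import json
import urllib.parse
import urllib.request
from html.parser import HTMLParser

SEARX_URL = "http://localhost:8080/search"  # assumed local SearXNG endpoint


class TextExtractor(HTMLParser):
    """Collect visible text from HTML, skipping script/style contents."""

    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())


def extract_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)


def search(query: str, timeout: float = 10.0) -> list:
    """Query the local SearXNG JSON API (format=json must be enabled in settings.yml)."""
    qs = urllib.parse.urlencode({"q": query, "format": "json"})
    with urllib.request.urlopen(f"{SEARX_URL}?{qs}", timeout=timeout) as resp:
        return json.load(resp).get("results", [])


def deep_search(query: str, rounds: int = 2, per_round: int = 3) -> dict:
    """Multi-round loop: search, fetch each new result URL, keep extracted text."""
    pages = {}
    for _ in range(rounds):
        for hit in search(query)[:per_round]:
            url = hit.get("url")
            if not url or url in pages:
                continue
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    pages[url] = extract_text(resp.read().decode("utf-8", "replace"))
            except OSError:
                continue  # unreachable or slow hosts are skipped, not fatal
    return pages
```

Note that `deep_search` will happily follow any URL SearXNG returns, which is exactly the outbound-reach point made above.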
Install Mechanism
There is no complex install spec; setup uses docker-compose to pull searxng/searxng:latest (the official Docker Hub image) and a local setup script to generate a secret key. Pulling an image from Docker Hub is normal here, but using the mutable 'latest' tag has supply-chain implications (image contents can change between pulls). The Python dependencies are standard and installed via pip when requested.
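One way to address the mutable-tag concern is to pin the image by digest in the compose file. This is a hedged sketch, not the package's shipped docker-compose.yml; the service name, port mapping, and volume path are assumptions, and the digest placeholder must be filled in from your own `docker inspect` output.

```yaml
services:
  searxng:
    # Mutable tag (as shipped): image: searxng/searxng:latest
    # Immutable pin by digest; fill in the digest you verified locally:
    image: searxng/searxng@sha256:<digest-from-docker-inspect>
    ports:
      - "127.0.0.1:8080:8080"   # bind to localhost only, not all interfaces
    volumes:
      - ./searxng:/etc/searxng:rw
```

After pulling once, `docker inspect --format '{{index .RepoDigests 0}}' searxng/searxng:latest` prints the digest to pin.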
Credentials
The skill does not request environment variables, credentials, or system config paths. It asks for reasonable local dependencies (Docker, Python and a couple of Python packages) which match the described functionality.
Persistence & Privilege
Skill is not marked always:true and does not require modifying other skills or system-wide agent settings. setup.sh updates only the included settings.yml and starts the local container; the container mounts only the local ./searxng config directory. No excessive privileges are requested in the manifest.
Assessment
This package appears coherent and implements a local SearXNG + scraper. Before installing:

1) Inspect the Docker image (searxng/searxng:latest), or pin a specific release tag to avoid surprise image changes.
2) Verify docker/searxng/settings.yml after setup (setup.sh replaces the placeholder secret for you).
3) Be aware that deep_research.py will fetch and parse URLs returned by searches; consider running the container and scraper in an isolated network namespace or behind a VPN if you want to limit outbound reach (this prevents accidental access to internal hosts).
4) Review and, if necessary, adjust IGNORED_DOMAINS and request timeouts/rate limits to avoid aggressive scraping.
5) If you require extra assurance, pull and inspect the searxng Docker image locally, or build from source and run with least-privilege container settings (no host networking unless you intend VPN=host).

Overall: functionally consistent with its stated purpose, but follow standard supply-chain and network-isolation best practices.

Like a lobster shell, security has layers: review code before you run it.
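Recommendation 4 above (domain blocklist plus rate limiting) can be sketched as follows. The domains listed and both helper names are illustrative assumptions; the package's actual IGNORED_DOMAINS list and throttling logic may differ.

```python
import time
import urllib.parse

# Example blocklist; the skill's shipped IGNORED_DOMAINS may differ.
IGNORED_DOMAINS = {"pinterest.com", "facebook.com"}


def is_ignored(url: str) -> bool:
    """True if the URL's host matches an ignored domain, including subdomains."""
    host = urllib.parse.urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in IGNORED_DOMAINS)


class RateLimiter:
    """Enforce a minimum interval between outbound requests to avoid hammering hosts."""

    def __init__(self, min_interval: float = 1.0):
        self.min_interval = min_interval
        self._last = 0.0

    def wait(self) -> None:
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last = time.monotonic()
```

Calling `is_ignored(url)` before each fetch, and `limiter.wait()` around it, gives the scraper the polite defaults the assessment asks you to verify.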
Tags: latest · privacy · research · scraping · search · searxng
