Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

onionclaw

v2.1.13

Search the Tor dark web, fetch .onion hidden-service pages, rotate Tor identities, and run structured multi-step OSINT investigations. Use when the user asks...


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for jacobjandon/onionclaw.

Prompt preview (Install & Setup):
Install the skill "onionclaw" (jacobjandon/onionclaw) from ClawHub.
Skill page: https://clawhub.ai/jacobjandon/onionclaw
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required binaries: python3, pip3, tor
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install onionclaw

ClawHub CLI


npx clawhub@latest install onionclaw
Security Scan

VirusTotal: Suspicious
OpenClaw: Suspicious (medium confidence)
Purpose & Capability
Name/description align with required binaries (python3, pip3, tor) and Python libs (requests[socks], stem, BeautifulSoup). All requested capabilities (search, fetch, rotate Tor identity, OSINT pipeline) are coherent with these requirements.
Instruction Scope
SKILL.md instructs you to run pip installs, a setup.py 'interactive first‑run wizard' that updates .env and torrc, and to restart system Tor (systemctl / brew services). setup.py is expected to modify system config files (e.g., /etc/tor/torrc) and create DataDirectory paths. Those actions are legitimate for a Tor tool but give the skill the ability to change system configuration and may require elevated privileges; the document does not publish the contents of setup.py, renew.py, or the other scripts, so inspect them before running.
Install Mechanism
Instruction-only skill (no install spec). It tells you to pip3 install specific PyPI packages — expected for Python tooling. Risk: pip installs are global by default in the examples (no virtualenv recommendation) and could overwrite system packages; no downloads from arbitrary URLs were specified.
Credentials
Registry metadata declares no required env vars, but SKILL.md references a .env with an optional LLM key and a TOR_DATA_DIR used by renew.py, a minor inconsistency. The tool also asks you to enable ControlPort/CookieAuthentication in torrc (required for circuit control), which is appropriate but increases its ability to control Tor circuits. No unrelated third‑party credentials are requested.
Persistence & Privilege
always:false (good). The skill documents a daemon/--daemon-poll mode and recurring watch/alert jobs; those would create long‑running background activity if enabled. Autonomous invocation (disable-model-invocation:false) is the platform default and not by itself suspicious.
What to consider before installing
This skill is coherent with its stated Tor/OSINT purpose, but exercise caution before running anything it recommends:

  • Inspect setup.py, renew.py, and any other referenced scripts — they will modify Tor configuration and may require root.
  • Run Python package installs inside a virtualenv or container to avoid polluting system Python.
  • Keep any LLM key or other secrets out of the .env until you trust the code; the README mentions TOR_DATA_DIR and an LLM key, but the registry metadata does not declare them.
  • If you want to try it, run it in an isolated VM/container and review the scripts that modify /etc/tor/torrc or create system services.
  • Consider the legal and ethical implications of dark-web scanning in your jurisdiction and organization.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

🧅 Clawdis
OS: macOS · Linux
Bins: python3, pip3, tor
Latest: vk9756xcj3qmx7jxngzxz2778jn832a21
216 downloads · 0 stars · 1 version
Updated 21h ago
v2.1.13 · MIT-0 · macOS, Linux

OnionClaw — Tor / Dark Web OSINT

v2.1.13 · by JacobJandon · MIT-0 License github.com/JacobJandon/OnionClaw

OnionClaw routes all requests through the Tor network. It queries 12 verified dark web search engines simultaneously, fetches .onion hidden-service pages, rotates Tor circuits, schedules recurring watch/alert jobs, and produces structured OSINT reports (Markdown, JSON, STIX, MISP, CSV) using the Robin investigation pipeline.


Setup (run once after install)

# 1. Install Python dependencies
pip3 install requests[socks] beautifulsoup4 python-dotenv stem

# 2. Interactive first-run wizard (sets up .env, torrc, and Tor in one step)
python3 {baseDir}/setup.py

# — OR — manual setup:
cp {baseDir}/.env.example {baseDir}/.env
# Edit {baseDir}/.env — add your LLM key (search + fetch work without one)

Start Tor (required before any command):

# Linux:
sudo apt install tor && sudo systemctl start tor

# macOS:
brew install tor && brew services start tor

# Custom (no root needed — setup.py can do this automatically):
tor -f /tmp/sicry_tor.conf &
# torrc: SocksPort 9050 / ControlPort 9051 / CookieAuthentication 1 / DataDirectory /tmp/tor_data

Enable circuit rotation (required for renew.py and --daemon-poll):

Add to /etc/tor/torrc:
  ControlPort 9051
  CookieAuthentication 1
Then: systemctl restart tor
setup.py does this automatically.

Commands

Check Tor is running

Always run this first before any dark web operation.

python3 {baseDir}/check_tor.py

Returns your exit IP and tor_active: true/false. If false, tell the user to start Tor before continuing.
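The contents of check_tor.py are not published, but a connectivity check along these lines is conventional: the Tor Project's check endpoint (https://check.torproject.org/api/ip) returns JSON like {"IsTor": true, "IP": "..."}, which maps directly onto the documented output. A sketch, with illustrative names (not OnionClaw's actual code):

```python
import json

def parse_tor_check(body: str) -> dict:
    """Map check.torproject.org's JSON ({"IsTor": bool, "IP": str})
    onto the {exit_ip, tor_active} shape check_tor.py documents."""
    data = json.loads(body)
    return {"exit_ip": data.get("IP"), "tor_active": bool(data.get("IsTor"))}

# To obtain `body`, fetch the endpoint through Tor's SOCKS proxy, e.g. with
# requests: requests.get("https://check.torproject.org/api/ip",
#                        proxies={"https": "socks5h://127.0.0.1:9050"}).text
# (socks5h resolves DNS through Tor as well, avoiding DNS leaks.)
```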


Rotate Tor identity

Get a fresh exit node and a new three-hop circuit. Use between sessions or whenever a new IP is needed.

python3 {baseDir}/renew.py

Returns success: true/false. If false, ensure ControlPort 9051 is enabled and TOR_DATA_DIR is set in .env (or use setup.py).
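renew.py itself is not shown, but circuit rotation over the control port is a small, well-documented protocol: authenticate with the hex-encoded bytes of Tor's control cookie, then send SIGNAL NEWNYM. A stdlib-only sketch, assuming the default cookie path and port from the torrc above (not OnionClaw's actual code):

```python
import socket
from pathlib import Path

def cookie_auth_line(cookie: bytes) -> str:
    # With CookieAuthentication 1, the client sends the cookie file's bytes as hex
    return f"AUTHENTICATE {cookie.hex()}\r\n"

def renew_identity(cookie_path: str, host: str = "127.0.0.1", port: int = 9051) -> bool:
    """Ask Tor for a new circuit via the control port. '250' replies mean success."""
    cookie = Path(cookie_path).read_bytes()
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(cookie_auth_line(cookie).encode())
        if not s.recv(1024).startswith(b"250"):
            return False
        s.sendall(b"SIGNAL NEWNYM\r\n")
        return s.recv(1024).startswith(b"250")
```

In practice the stem library (already a declared dependency) wraps this: `Controller.from_port(port=9051)`, `controller.authenticate()`, `controller.signal(Signal.NEWNYM)`.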


Check which search engines are alive

Ping all 12 engines via Tor and return latency + up/down for each.

python3 {baseDir}/check_engines.py

Run before a large search session; pass the alive engine names to --engines to skip dead ones and save time.


Search the dark web

Query all 12 dark web engines simultaneously. Returns deduplicated {title, url, engine} results.

# Basic:
python3 {baseDir}/search.py --query "SEARCH_TERM"

# Limit results:
python3 {baseDir}/search.py --query "SEARCH_TERM" --max 30

# Specific engines:
python3 {baseDir}/search.py --query "SEARCH_TERM" --engines Ahmia Tor66 Ahmia-clearnet

Available engines: Ahmia, OnionLand, Amnesia, Torland, Excavator, Onionway, Tor66, OSS, Torgol, TheDeepSearches, DuckDuckGo-Tor, Ahmia-clearnet

Tip: Use short keyword queries (≤5 words). Dark web indexes respond far better to focused keywords than natural-language questions.
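The documentation does not show how search.py deduplicates across engines, but a plausible approach is to key on a normalized URL and keep the first engine that returned each hit:

```python
def dedupe_results(results):
    """Merge {title, url, engine} dicts from multiple engines, keeping the
    first occurrence of each URL (scheme, case, and trailing slash folded)."""
    seen, unique = set(), []
    for r in results:
        key = r["url"].strip().lower().rstrip("/").split("://")[-1]
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique
```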


Fetch a .onion page

Read the full text of any .onion URL (or clearnet URL) through Tor.

python3 {baseDir}/fetch.py --url "http://SOME.onion/path"

Returns: {title, text (first 3000 chars), links, status, error}. If status: 0 or error is set, the hidden service is offline — they go down frequently; try a different result from search.py.
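fetch.py's parsing is not published either; a stdlib-only sketch of extracting the documented {title, text, links} fields from fetched HTML (the real code presumably uses the BeautifulSoup dependency instead):

```python
from html.parser import HTMLParser

class PageExtractor(HTMLParser):
    """Collect <title>, visible text, and <a href> targets from HTML."""
    def __init__(self):
        super().__init__()
        self.title, self.links, self.chunks = "", [], []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif data.strip():
            self.chunks.append(data.strip())

def extract_page(html: str) -> dict:
    p = PageExtractor()
    p.feed(html)
    # fetch.py documents returning only the first 3000 chars of text
    return {"title": p.title, "text": " ".join(p.chunks)[:3000], "links": p.links}
```

(A production parser would also drop `<script>`/`<style>` content; this sketch skips that for brevity.)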


OSINT analysis

Analyse raw dark web text with an LLM and produce a structured sectioned report.

# From a string:
python3 {baseDir}/ask.py --query "QUERY" --mode MODE --content "RAW_TEXT"

# From a file:
python3 {baseDir}/ask.py --query "QUERY" --mode MODE --file /path/to/content.txt

# From stdin (pipe):
echo "CONTENT" | python3 {baseDir}/ask.py --query "QUERY" --mode MODE

Analysis modes:

Mode               Use for
threat_intel       General OSINT (default) — artifacts, insights, next steps
ransomware         Malware / C2 / MITRE ATT&CK TTPs, victim orgs, indicators
personal_identity  PII / breach exposure, severity, protective actions
corporate          Leaked credentials / code / internal docs, IR steps
# With custom focus appended to the prompt:
python3 {baseDir}/ask.py --query "QUERY" --mode threat_intel \
  --custom "Focus on cryptocurrency wallet addresses"
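ask.py's prompt construction is not shown; the behavior described (a per-mode analysis prompt with any --custom text appended) might look like the sketch below. The MODE_PROMPTS strings are hypothetical paraphrases of the mode descriptions above, not the real prompts:

```python
# Hypothetical mode prompts -- the real ones live inside ask.py
MODE_PROMPTS = {
    "threat_intel": "Extract artifacts, insights, and next steps from this OSINT content.",
    "ransomware": "Identify malware/C2 infrastructure, MITRE ATT&CK TTPs, victim orgs, indicators.",
    "personal_identity": "Assess PII/breach exposure, severity, and protective actions.",
    "corporate": "Find leaked credentials/code/internal docs and suggest IR steps.",
}

def build_prompt(mode: str, query: str, custom: str = "") -> str:
    base = MODE_PROMPTS.get(mode, MODE_PROMPTS["threat_intel"])  # threat_intel is the default
    prompt = f"{base}\n\nInvestigation query: {query}"
    if custom:
        prompt += f"\n\nAdditional focus: {custom}"  # --custom is appended, per the docs
    return prompt
```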

Full OSINT pipeline (single command)

Runs the complete Robin pipeline: refine query → check live engines → search → filter best results → batch scrape → OSINT analysis → save report

python3 {baseDir}/pipeline.py --query "INVESTIGATION_QUERY" --mode MODE

Essential flags:

Flag                Default       Description
--query TEXT        required      Investigation topic (natural language OK — refined automatically)
--mode MODE         threat_intel  threat_intel / ransomware / personal_identity / corporate
--max N             30            Max raw results from search
--scrape N          8             Pages to batch-fetch (use 0 to skip scraping and get a results-only report)
--custom TEXT                     Extra LLM instructions appended to the mode prompt
--out FILE                        Save report to file (exits 1 on permission error)
--format FMT        md            Output format: md / json / csv / stix / misp
--no-llm                          Skip all LLM steps — dump raw results / entity extraction only
--confidence                      Show BM25 confidence score per result
--engines NAME…                   Restrict to specific engines (skip dead ones)
--no-cache                        Bypass query/page cache for this run
--clear-cache                     Flush the result cache, then run
--resume JOB_ID                   Resume a checkpointed pipeline run by job ID
--interactive                     After the report, open a follow-up REPL for drill-down
--output-dir DIR                  Write <job_id>.<ext> into DIR (batch pipeline friendly)
--modes                           List all modes and their engine routing, then exit
--engine-stats                    Print per-engine reliability / latency table, then exit
--check-update                    Check for a newer OnionClaw release and exit
--version                         Print version and exit

MISP-specific flags:

Flag                   Default  Description
--misp-threat-level N  2        MISP threat level 1–4 (1=high, 4=undefined)
--misp-distribution N  0        MISP distribution (0=your org, 1=connected, 2=all, 3=inherited)

Watch / alert flags:

Flag                            Description
--watch                         Register this query as a recurring watch job and exit
--interval HOURS                Re-run interval in hours for --watch (default 6)
--watch-check                   Run all due watch jobs now and print alerts
--watch-check --output-dir DIR  Same, but write each job's JSON to DIR (exits 1 on write error)
--watch-list                    List all active watch jobs
--watch-disable JOB_ID          Disable a watch job by ID
--watch-clear-all               Disable ALL active watch jobs at once
--watch-daemon                  (deprecated alias) Run as a blocking daemon loop
--daemon-poll SECONDS           Run --watch-check every N seconds in a daemon loop

Daemon mode (continuous monitoring)

Keep OnionClaw running and poll watch jobs at a fixed interval:

python3 {baseDir}/pipeline.py --daemon-poll 3600   # check every hour
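The daemon loop itself is not published; its documented behavior (run the watch-check step every N seconds, indefinitely) reduces to a sketch like this. The job runner is injected, and max_iterations is a testing convenience, not a real OnionClaw flag:

```python
import time

def daemon_poll(run_watch_check, interval_s, max_iterations=None):
    """Call run_watch_check() every interval_s seconds.
    max_iterations (hypothetical, for testing) bounds the loop;
    the real daemon would run until interrupted."""
    count = 0
    while max_iterations is None or count < max_iterations:
        run_watch_check()
        count += 1
        if max_iterations is None or count < max_iterations:
            time.sleep(interval_s)  # sleep between checks, not after the last one
    return count
```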

Scheduling watch jobs

Register a query as a recurring alert:

# Register (runs every 6 hours by default):
python3 {baseDir}/pipeline.py --query "ransomware hospital 2026" --watch --interval 6

# List all active jobs:
python3 {baseDir}/pipeline.py --watch-list

# Check due jobs now and write JSON files for each:
python3 {baseDir}/pipeline.py --watch-check --output-dir /tmp/alerts/

# Disable one job:
python3 {baseDir}/pipeline.py --watch-disable <JOB_ID>

# Clear all:
python3 {baseDir}/pipeline.py --watch-clear-all
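The scheduler's internals are not shown; conceptually, the watch-check step just selects jobs whose re-run interval has elapsed since their last run. A sketch with an assumed job shape (the real job records may differ):

```python
from datetime import datetime, timedelta

def due_jobs(jobs, now=None):
    """Return enabled watch jobs whose re-run interval has elapsed.
    Assumed job shape: {id, query, interval_hours, last_run (datetime|None), enabled}."""
    now = now or datetime.utcnow()
    return [
        j for j in jobs
        if j.get("enabled", True)
        and (j["last_run"] is None  # never run yet -> due immediately
             or now - j["last_run"] >= timedelta(hours=j["interval_hours"]))
    ]
```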

Typical investigation flows

"Search the dark web for X"

  1. python3 {baseDir}/check_tor.py — verify connected
  2. python3 {baseDir}/search.py --query "X" — search all 12 engines
  3. python3 {baseDir}/fetch.py --url "URL" — read top 2–3 results
  4. python3 {baseDir}/ask.py --mode threat_intel --query "X" --content "..." — generate report

"Has company.com appeared in dark web leaks?"

  1. python3 {baseDir}/check_tor.py
  2. python3 {baseDir}/pipeline.py --query "company.com credentials leak" --mode corporate
  3. Present the structured report

"Investigate ransomware group X"

  1. python3 {baseDir}/check_tor.py
  2. python3 {baseDir}/pipeline.py --query "GROUP_NAME ransomware" --mode ransomware

"Write a STIX bundle for this investigation"

python3 {baseDir}/pipeline.py \
  --query "QUERY" --mode threat_intel \
  --format stix --out bundle.json
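The exact STIX output is not shown; a minimal sketch of wrapping discovered .onion URLs as STIX 2.1 indicator objects inside a bundle (field names follow the STIX 2.1 specification; this is not OnionClaw's code):

```python
import json
import uuid
from datetime import datetime, timezone

def onion_urls_to_stix_bundle(urls):
    """Wrap URLs as STIX 2.1 url-pattern indicators inside a bundle."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    objects = [
        {
            "type": "indicator",
            "spec_version": "2.1",
            "id": f"indicator--{uuid.uuid4()}",
            "created": now,
            "modified": now,
            "pattern": f"[url:value = '{u}']",
            "pattern_type": "stix",
            "valid_from": now,
        }
        for u in urls
    ]
    return {"type": "bundle", "id": f"bundle--{uuid.uuid4()}", "objects": objects}
```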

"Fetch this .onion URL"

  1. python3 {baseDir}/check_tor.py
  2. python3 {baseDir}/fetch.py --url "URL"
  3. Show the user the title + text content

"Monitor for new leaks mentioning acme.com, alert me daily"

python3 {baseDir}/pipeline.py \
  --query "acme.com leak credentials" --watch --interval 24
# Later, in a cron job or daemon:
python3 {baseDir}/pipeline.py --watch-check --output-dir /tmp/acme-alerts/

Output formats

Format    Flag                   Use for
Markdown  --format md (default)  Human-readable reports, --out report.md
JSON      --format json          Structured machine-readable output, automation
CSV       --format csv           Spreadsheet import, result lists
STIX 2.1  --format stix          Threat-intel platforms (MISP, OpenCTI, Splunk ES)
MISP      --format misp          Direct MISP event import

Important notes

  • All traffic routes through Tor — tell the user this when relevant.
  • .onion hidden services go offline frequently. status: 0 means the site is temporarily unreachable — try a different result from search.py.
  • Dark web search indexes go down often — run check_engines.py first and pass only alive engine names with --engines.
  • LLM tools (ask.py, pipeline steps 3/5/7) require an API key in {baseDir}/.env. Set LLM_PROVIDER=ollama for fully local inference with no key. search.py, fetch.py, check_tor.py, renew.py, and check_engines.py work with no key at all.
  • --scrape 0 skips page fetching. The pipeline still runs step 7 (LLM analysis on search-result metadata only) and writes --out / --output-dir normally. A WARN: --scrape 0 notice is printed to stderr.
  • Use responsibly and lawfully — OSINT, security research, and threat intelligence only.

Maintenance

Update the bundled sicry.py engine

OnionClaw bundles sicry.py from the upstream SICRY™ repo. After a new SICRY™ release, sync the bundled copy:

# Pull latest:
python3 {baseDir}/sync_sicry.py

# Pull a specific release tag:
python3 {baseDir}/sync_sicry.py --tag v2.1.13

# Preview without writing:
python3 {baseDir}/sync_sicry.py --dry-run

Checking for OnionClaw updates

OnionClaw checks the GitHub Releases API (published releases only — not plain git tags) for newer versions. A one-line notice is printed automatically at pipeline startup when an update is available.

# On-demand update check:
python3 {baseDir}/pipeline.py --check-update

# Programmatic:
import sicry
r = sicry.check_update()
if not r["up_to_date"]:
    print(f"Update: {r['current']} → {r['latest']}  {r['url']}")
# Upgrade:
git -C {baseDir} pull
python3 {baseDir}/sync_sicry.py
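The comparison inside check_update is not shown; deciding "up to date" against the latest GitHub release tag typically reduces to a numeric tuple comparison. A sketch (function names are illustrative, not sicry's real internals):

```python
def version_tuple(tag):
    """'v2.1.13' -> (2, 1, 13); tolerates a leading 'v'."""
    return tuple(int(p) for p in tag.lstrip("v").split("."))

def is_up_to_date(current, latest):
    # Tuple comparison handles 2.1.9 < 2.1.13 correctly,
    # unlike a naive string comparison
    return version_tuple(current) >= version_tuple(latest)
```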
