Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Sirchmunk

Local file search using sirchmunk API. Use when you need to search for files or content by asking natural language questions.

MIT-0 · Free to use, modify, and redistribute. No attribution required.
0 current installs · 0 all-time installs
Security Scan
VirusTotal
Suspicious
View report →
OpenClaw
Suspicious
medium confidence
Purpose & Capability
Name, description, SKILL.md, and the included shell wrapper (scripts/sirchmunk_search.sh) are consistent: the skill is a thin client that POSTs a search query to a local sirchmunk server (http://localhost:8584/api/v1/search). The prerequisites (pip install sirchmunk, sirchmunk init, sirchmunk serve) match the stated purpose.
Instruction Scope
The SKILL.md and script themselves only send queries to a localhost endpoint and do not directly read or transmit arbitrary local files. However, the instructions require configuring ~/.sirchmunk/.env with LLM_API_KEY, LLM_BASE_URL, and LLM_MODEL_NAME and running the sirchmunk server — that server is likely responsible for reading configured search paths and contacting the external LLM. Because the skill delegates file access/networking to the server, the actual data flow depends on the server's behavior (not included here).
Install Mechanism
There is no install spec in the registry (instruction-only skill). The SKILL.md suggests installing sirchmunk via pip, which is expected for a Python-based local server; the included script is a simple curl wrapper with no obfuscated or high-risk install actions.
Credentials
The registry declares no required environment variables, but SKILL.md instructs the user to create ~/.sirchmunk/.env with LLM_API_KEY, LLM_BASE_URL, and LLM_MODEL_NAME (sensitive credentials). This mismatch (required secrets present in docs but not declared) is a red flag: the local server may transmit file contents to whichever LLM endpoint is configured, so providing those keys can enable exfiltration of searched content to external services.
Persistence & Privilege
The skill is not always-on, does not request special platform privileges, and the provided script does not modify other skills or system configuration. Autonomous invocation is allowed (platform default) but not combined with other high privileges here.
What to consider before installing
This skill is a small local client that talks to a sirchmunk server on localhost; the immediate code is simple and does not itself exfiltrate data. However, before installing or using it you should:

  1. Inspect the sirchmunk server code (the SKILL.md points to a GitHub repo) to confirm how it reads files and where it sends content.
  2. Be cautious about providing an LLM_API_KEY/LLM_BASE_URL: if these point to a cloud LLM, your searched file contents may be sent off-host. Prefer running a local, trusted LLM endpoint, or omit sensitive directories from SIRCHMUNK_SEARCH_PATHS.
  3. Keep the server bound to localhost and firewall it from external access.
  4. Verify the contents of ~/.sirchmunk/.env, and don't store global secrets there unless you trust the server implementation.
  5. If unsure, run sirchmunk in an isolated environment or container and review network traffic while performing searches.

These steps reduce the risk of accidental data exfiltration.
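One way to enforce keeping the server bound to localhost is to check the configured bind address before starting the server. A minimal POSIX-shell sketch follows; note that `SIRCHMUNK_BIND` and `is_loopback` are hypothetical names used for illustration and are not part of sirchmunk itself:

```shell
#!/bin/sh
# Sketch: refuse to proceed when the configured bind address is not a
# loopback address. SIRCHMUNK_BIND and is_loopback are illustrative
# names only; sirchmunk's actual configuration may differ.
is_loopback() {
  case "$1" in
    127.*|localhost|::1) return 0 ;;
    *) return 1 ;;
  esac
}

bind="${SIRCHMUNK_BIND:-127.0.0.1}"
if is_loopback "$bind"; then
  echo "bind address $bind ok"
else
  echo "refusing non-loopback bind: $bind" >&2
  exit 1
fi
```

A check like this fails closed: if someone later sets the bind address to 0.0.0.0, the wrapper aborts instead of silently exposing the search API to the network.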

Like a lobster shell, security has layers — review code before you run it.

Current version: v1.0.0
Download zip
latest · vk970xd4ewnx83ydk7enbae489n839vya

License

MIT-0
Free to use, modify, and redistribute. No attribution required.

SKILL.md

Sirchmunk Search

Simple local file search powered by LLM, no embedding-db, no indexing, no ETL.

Tool: sirchmunk_search

Single parameter: query — your search question in natural language.

Example:

~/.openclaw/skills/sirchmunk/scripts/sirchmunk_search.sh "What is the RL agent's reward function?"

Under the hood:

curl -s -X POST "http://localhost:8584/api/v1/search" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "<your query>",
    "paths": ["/path/to/search_paths"],
    "mode": "FAST"
  }'
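The inline JSON above breaks if the query itself contains double quotes or backslashes. A safer sketch (assuming jq is installed; the query value is illustrative) builds the request body with jq so the query is escaped correctly:

```shell
# Build the request body with jq so quotes and backslashes in the
# natural-language query are JSON-escaped instead of breaking the body.
query="What is the RL agent's \"reward\" function?"
body=$(jq -n --arg q "$query" '{query: $q, mode: "FAST"}')
echo "$body"
# POST it as before (server assumed on localhost:8584):
# curl -s -X POST "http://localhost:8584/api/v1/search" \
#   -H "Content-Type: application/json" -d "$body"
```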

Notes: Search paths must either be pre-configured on the server via SIRCHMUNK_SEARCH_PATHS or passed explicitly in the paths field of the request.

Prerequisites

  1. Sirchmunk installed: pip install sirchmunk
  2. Run sirchmunk init
  3. Configure ~/.sirchmunk/.env: LLM_API_KEY, LLM_BASE_URL, and LLM_MODEL_NAME are required; SIRCHMUNK_SEARCH_PATHS is optional.
  4. Server running: sirchmunk serve
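A hypothetical ~/.sirchmunk/.env is sketched below. The key names come from the prerequisites above; every value shown is a placeholder, and the localhost base URL reflects the review's advice to prefer a local endpoint:

```shell
# ~/.sirchmunk/.env — illustrative placeholder values only
LLM_API_KEY=YOUR_API_KEY_HERE          # credential for the configured LLM endpoint
LLM_BASE_URL=http://localhost:8000/v1  # a local endpoint keeps searched file contents on-host
LLM_MODEL_NAME=your-model-name
# Optional: restrict which directories the server may read
SIRCHMUNK_SEARCH_PATHS=/home/you/projects
```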

Homepage

https://github.com/modelscope/sirchmunk

Files

2 total
