AI守门人 (AI Gatekeeper)
Review
Audited by ClawScan on May 10, 2026.
Overview
The skill is, for the most part, the disclosed local LLM proxy, but its start/stop script can forcibly kill unrelated processes that use the configured port.
Use this only if you are comfortable running a local background LLM proxy. Before starting or restarting, confirm port 18888 is not used by another service, use scoped provider API keys, review the filtering rules and logs, and stop the proxy when finished.
Findings (6)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Starting, stopping, or restarting the skill could terminate an unrelated local service if it is using the same port.
The default control flow can forcibly terminate every process listening on the configured port, without verifying that it is the llm-proxy process.
kill_by_port "$PROXY_PORT" ... pids=$(lsof -ti ":$port" ...); ... kill -9 "$pid"
Check the port and PID before running start/stop/restart. The maintainer should verify process ownership, ask before killing, and prefer graceful termination.
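The safer stop routine recommended above can be sketched as follows. This is illustrative, not the skill's code: port 18888 and the python3 interpreter are taken from the review excerpts, and the ownership check plus SIGTERM-then-SIGKILL escalation are suggested additions.

```shell
#!/bin/sh
# Safer alternative to kill_by_port: only terminate PIDs on the port that
# look like the proxy, and try graceful termination before SIGKILL.
PROXY_PORT="${PROXY_PORT:-18888}"

safe_stop() {
    port="$1"
    pids=$(lsof -ti ":$port" 2>/dev/null)
    for pid in $pids; do
        # Verify ownership before touching the process: skip anything that
        # does not look like the proxy's python interpreter.
        cmd=$(ps -p "$pid" -o comm= 2>/dev/null)
        case "$cmd" in
            python*) ;;
            *) echo "skipping PID $pid ($cmd): not the proxy" >&2; continue ;;
        esac
        # Graceful termination first; escalate only if the process lingers.
        kill -TERM "$pid" 2>/dev/null
        i=0
        while [ "$i" -lt 5 ] && kill -0 "$pid" 2>/dev/null; do
            sleep 1; i=$((i + 1))
        done
        kill -0 "$pid" 2>/dev/null && kill -9 "$pid"
    done
    echo "port $port: stop routine finished"
}

safe_stop "$PROXY_PORT"
```

On a port with no listeners, the routine does nothing and exits cleanly, which is exactly the behavior the unconditional `kill -9` version lacks.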
Model output may be changed or interrupted by the proxy’s safety rules.
The proxy can inject warning chunks into streaming responses and block responses that match critical content rules. This is aligned with the stated safety-filtering purpose.
warning_chunk = inject_warning_chunk(quick_alerts) ... if critical: ... return json.dumps(error_response, ensure_ascii=False).encode('utf-8'), True
Review and test the filtering rules before relying on the proxy for production or sensitive workflows.
Provider API keys pass through the local proxy when you use it.
The proxy is intended to receive and forward provider authorization headers. This is expected for an LLM API proxy, and the artifacts do not show unrelated credential collection.
-H "Authorization: Bearer $OPENAI_API_KEY"
Use provider keys with appropriate scope and billing controls, and only send requests from trusted local clients.
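Before routing real Authorization headers through the proxy, it is worth confirming the listener is bound to loopback only. This sketch assumes port 18888 from the review; it is a generic lsof check, not the skill's own code.

```shell
#!/bin/sh
# Confirm the proxy port is only reachable from localhost before sending keys.
PROXY_PORT="${PROXY_PORT:-18888}"

check_loopback_only() {
    listeners=$(lsof -nP -iTCP:"$PROXY_PORT" -sTCP:LISTEN 2>/dev/null | tail -n +2)
    if [ -z "$listeners" ]; then
        echo "nothing listening on port $PROXY_PORT"
        return 0
    fi
    # Any listener not bound to 127.0.0.1 (or ::1) is exposed beyond localhost.
    exposed=$(echo "$listeners" | grep -v -e '127\.0\.0\.1' -e '\[::1\]')
    if [ -n "$exposed" ]; then
        echo "WARNING: proxy port reachable beyond loopback"
    else
        echo "proxy is loopback-only"
    fi
}

check_loopback_only
```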
The skill may fail or behave differently on systems that lack its undeclared tool dependencies.
The scripts depend on local tools such as curl, python3, and lsof, while the registry metadata lists no required binaries. This is under-declared but visible and purpose-aligned.
curl -s --max-time 3 "$PROXY_URL" ... python3 -u "$PROXY_SCRIPT" ... pids=$(lsof -ti ":$port"
Ensure python3, curl, and lsof are available, and the maintainer should declare these runtime requirements.
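A preflight check for the undeclared dependencies named in this finding can be as simple as:

```shell
#!/bin/sh
# Verify the tools the scripts rely on (per the review) before starting.
missing=""
for tool in python3 curl lsof; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
msg="all required tools present"
[ -n "$missing" ] && msg="missing required tools:$missing"
echo "$msg"
```

Running this before the start command turns a confusing mid-script failure into a clear, actionable message.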
Prompts, responses, and metadata sent through the proxy may leave the machine for the selected provider.
The local gateway routes requests to multiple external provider endpoints. This is the core disclosed purpose, and it is bound to localhost by default.
"listen_host": "127.0.0.1" ... "/openai": { "url": "https://api.openai.com/v1" }
Verify the selected provider route before sending sensitive prompts, and understand each provider’s retention and billing policies.
The proxy can continue handling local requests and writing logs after the start command returns.
The proxy starts as a background process and records a PID file. This is disclosed by the start/stop workflow and does not show hidden autostart behavior.
python3 -u "$PROXY_SCRIPT" >> "$LOG_FILE" 2>&1 & ... echo $! > "$PID_FILE"
Stop it when finished and monitor the log directory if using it with sensitive requests.
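A quick status check makes the background process visible between start and stop. The PID_FILE and LOG_FILE paths below are assumptions based on the review's excerpt of the start command, not the skill's actual paths.

```shell
#!/bin/sh
# Check whether the background proxy is still alive and show recent log lines.
PID_FILE="${PID_FILE:-/tmp/llm-proxy.pid}"
LOG_FILE="${LOG_FILE:-/tmp/llm-proxy.log}"

proxy_status() {
    if [ ! -f "$PID_FILE" ]; then
        echo "no PID file: proxy not started (or already stopped)"
        return 0
    fi
    pid=$(cat "$PID_FILE")
    if kill -0 "$pid" 2>/dev/null; then
        echo "proxy running as PID $pid"
        [ -f "$LOG_FILE" ] && tail -n 5 "$LOG_FILE"
    else
        echo "stale PID file: PID $pid is gone"
    fi
}

proxy_status
```

A stale PID file is worth cleaning up before restarting, since the review notes the restart path falls back to killing by port.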
