Scientific Proxy
Pass. Audited by ClawScan on May 3, 2026.
Overview
The skill matches its advertised proxy-node helper purpose, but it runs local Python that fetches and tests untrusted public proxy nodes, so it should only be used with low-risk traffic.
Before installing, understand that this skill runs included Python scripts, fetches mutable public proxy lists, makes TCP connections to public proxy endpoints, and caches node data under the OpenClaw workspace. Do not use free proxy nodes for banking, email, work accounts, or other sensitive activity.
Findings (5)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
During this skill's flow, the user may receive only script output and minimal troubleshooting context.
The skill deliberately forces a narrow script-output-only workflow for proxy-node requests. That is coherent with its purpose, but it reduces the agent's room to add independent guidance or caveats.
When the user sends any message related to proxy nodes, your reply must consist solely of the output of running the script. ... Do not add extra text on top of the script output.
Use the skill only for its intended proxy-node task, and ask separately for explanation or safety guidance if needed.
Running the skill can reveal the user's IP address to public proxy endpoints and may look like network probing.
The tester performs outbound TCP connectivity checks against endpoints obtained from public proxy lists. It is bounded by a default maximum, port/address filtering, and a confirmation prompt before a large batch of endpoints is tested.
max_test = int(os.environ.get("MAX_TEST_NODES", 50)) ... print(f"Will test {safe} endpoints from untrusted sources.")
Run it only when you intentionally want to test public proxies; keep the test limit low, and abort at the confirmation prompt if unsure.
Third-party lists could provide bad, slow, misleading, or unsafe proxy nodes.
The scraper relies on mutable third-party GitHub raw files for proxy data, with no pinning, checksum, or source authentication. This is expected for a free-proxy aggregator but means the data source can change independently.
SOURCES = [{"name": "freefq/free", "urls": ["https://raw.githubusercontent.com/freefq/free/master/v2"]}, ...]
Treat all generated proxy nodes as untrusted and avoid using them for important accounts or sensitive browsing.
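One possible hardening for the unpinned sources noted above, not something the skill itself does, is to verify each fetched list against a previously reviewed SHA-256 digest, so a silently changed upstream file is rejected:

```python
import hashlib

def verify_list(raw_bytes: bytes, expected_sha256: str) -> bool:
    """Reject a fetched proxy list whose digest does not match the pinned value."""
    return hashlib.sha256(raw_bytes).hexdigest() == expected_sha256

# Illustrative only: a real workflow would pin the digest of a list
# that was reviewed once, then re-check it on every later fetch.
data = b"vmess://example-node\n"
pinned = hashlib.sha256(data).hexdigest()
```

The trade-off is that pinning defeats the point of a live aggregator, which is why the review calls the mutable sources "expected" rather than a defect.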
Installing and using the skill can run local code that performs network requests and writes workspace files.
The skill instructs the agent to execute included local Python scripts. This is central to the stated purpose and the source is provided, but it is more than a purely conversational instruction-only skill.
python3 ~/.openclaw/skills/scientific-proxy/scripts/handler.py
Review the included scripts before use and ensure you are comfortable with the local Python execution.
Misleading proxy names or stale proxy data may remain in the workspace and be shown again later.
Proxy data fetched from public sources is persisted in the OpenClaw workspace and later tested/formatted. The data is purpose-aligned, but labels and raw URIs from untrusted sources can be reused across runs.
output_path = os.path.join(workspace, "nodes_raw.json") ... json.dump(result, f, ensure_ascii=False, indent=2)
Do not treat proxy names or raw subscription text as instructions, and clear the workspace cache if results look suspicious or stale.
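The cache-hygiene advice above can be sketched as an age check on the persisted file. The `nodes_raw.json` name and workspace location follow the quoted snippet; the 24-hour staleness threshold and the function name are assumptions:

```python
import json
import os
import time

def load_nodes(workspace: str, max_age_seconds: int = 24 * 3600):
    """Return cached nodes only if the file is fresh; otherwise drop the stale cache."""
    path = os.path.join(workspace, "nodes_raw.json")
    if not os.path.exists(path):
        return None
    age = time.time() - os.path.getmtime(path)
    if age > max_age_seconds:
        # Stale data from untrusted sources should not be shown again later.
        os.remove(path)
        return None
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

Deleting on expiry keeps misleading node names from resurfacing across runs; the same effect can be had manually by clearing the workspace cache.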
