Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Ubuntu Ollama

v1.0.0

Ubuntu Ollama — run Ollama on Ubuntu with fleet routing across multiple Ubuntu machines. Ubuntu Ollama setup with apt, systemd, and NVIDIA CUDA. Route Ollama...

0 · 56 · 1 current · 1 all-time
by Twin Geeks (@twinsgeeks)
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal
Suspicious
OpenClaw
Benign
medium confidence
Purpose & Capability
Name/description (Ubuntu Ollama fleet/router) matches the instructions: installing Ollama, pip-installing a 'herd' package, configuring systemd, opening a port, and using mDNS for discovery. Requested binaries (curl/wget, optional python3/pip, systemctl, apt, nvidia-smi) are appropriate for the described tasks.
Instruction Scope
SKILL.md instructs the agent/operator to run network installs (curl | sh from ollama.ai), pip install a package (ollama-herd), write systemd unit files under /etc/systemd/system, enable services, open firewall ports, and rely on mDNS/HTTP for node discovery. These actions are within the expected scope of deploying a cluster but require root privileges and expose a network service; the doc does not describe authentication for the router endpoint (examples use api_key='not-needed').
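The listing does not show the unit files the skill writes under /etc/systemd/system, but a minimal sketch of what such a service unit typically looks like is below. The unit name, binary path, user, and port here are assumptions for illustration, not taken from the skill:

```ini
# /etc/systemd/system/ollama-herd.service (hypothetical name and paths)
[Unit]
Description=Ollama herd router (illustrative sketch)
After=network-online.target
Wants=network-online.target

[Service]
# Running as a dedicated user rather than root limits the blast radius
# discussed in this report; the skill itself may specify otherwise.
User=ollama
ExecStart=/usr/local/bin/ollama-herd --port 11435
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Reviewing the actual unit the skill writes (with `systemctl cat <unit>`) before enabling it is a quick way to confirm it matches what the documentation claims.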
Install Mechanism
There is no registry install spec (instruction-only), but the runtime instructions direct execution of remote installers (curl -fsSL https://ollama.ai/install.sh | sh) and pip install of 'ollama-herd'. Fetching and executing remote scripts and installing packages from PyPI are standard but high-impact operations — they should be verified before running. The referenced domains (ollama.ai, github.com repo in homepage) are recognizable, but the skill does not include integrity checks or pinned releases.
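Since the skill pins no release or checksum, the download-then-verify pattern below is one way to avoid piping an unreviewed installer into sh. This sketch uses a local dummy script so it is self-contained; in practice the file would be fetched with `curl -fsSL https://ollama.ai/install.sh -o installer.sh` and `pinned` would be a known-good checksum recorded after reading the script:

```shell
set -eu

# Stand-in for the downloaded installer (replace with the real fetch).
printf 'echo installer ran\n' > installer.sh

# Normally a checksum you recorded after reviewing the script; computed
# from the dummy file here only so the sketch runs on its own.
pinned="$(sha256sum installer.sh | awk '{print $1}')"

actual="$(sha256sum installer.sh | awk '{print $1}')"
if [ "$pinned" = "$actual" ]; then
    sh installer.sh
else
    echo "checksum mismatch; refusing to run" >&2
    exit 1
fi
```

The same pattern applies to the PyPI package: download it first (`pip download ollama-herd`), inspect the archive, then install the reviewed copy.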
Credentials
The skill requests no environment variables or credentials and the declared config paths (~/.fleet-manager/...) are consistent with a fleet manager. There are no requests for unrelated secrets or system-wide config beyond systemd and network settings.
Persistence & Privilege
Instructions create and enable system-wide systemd services and modify firewall rules — actions that require root privilege and will persist across reboots. This persistence is reasonable for a long-running router/node service, but it increases blast radius if the installed software is malicious or misconfigured (e.g., an unauthenticated HTTP API exposed on port 11435).
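If the services are installed anyway, a systemd drop-in can shrink that blast radius. The directives below are a generic hardening sketch, not something the skill ships, and the unit name and state path are assumptions:

```ini
# /etc/systemd/system/ollama-herd.service.d/hardening.conf (hypothetical)
[Service]
NoNewPrivileges=yes
ProtectSystem=strict
ProtectHome=yes
PrivateTmp=yes
# Allow writes only where the fleet manager keeps its state.
ReadWritePaths=/var/lib/ollama-herd
```

After adding a drop-in, `systemctl daemon-reload` and a service restart apply it; `systemd-analyze security <unit>` reports how exposed the unit still is.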
Assessment
This skill appears to do what it says (set up an Ollama fleet), but it requires running network installers and creating system-level services. Treat it like any other system service install:
1) Verify upstream sources: confirm the ollama.ai installer and the 'ollama-herd' package are the official/reputable releases referenced by the GitHub repo listed in the skill.
2) Inspect the code/package before pip installing, or install in an isolated environment, container, or VM.
3) Avoid piping unknown scripts directly into sh; download and review the installer first.
4) Be cautious about enabling an unauthenticated HTTP endpoint on port 11435: restrict access with firewall rules or network segmentation, or enable authentication if supported.
5) If you must try it on production hosts, prefer testing in a sandboxed environment and verify service behavior (auth, logging, resource usage) before rolling out to multiple machines.
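The port restriction in point 4 can be enforced at the packet filter. A minimal nftables sketch that admits only one LAN subnet to port 11435 follows; the subnet is an assumption about your network, and on hosts using ufw the equivalent allow/deny rules apply:

```
# nftables fragment (illustrative): limit the fleet API to the local subnet
table inet fleet {
  chain input {
    type filter hook input priority 0; policy accept;
    tcp dport 11435 ip saddr 192.168.0.0/16 accept
    tcp dport 11435 drop
  }
}
```

Loading this with `nft -f` before enabling the service means the unauthenticated endpoint is never reachable from outside the segment, even briefly.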

Like a lobster shell, security has layers — review code before you run it.

latest · vk979mj70dbv0ws79hk725az2cx846t3w


Runtime requirements

Clawdis
OS: Linux
Any bin: curl, wget
