Linux Ollama

v1.0.0

Linux Ollama — run Ollama on Linux with fleet routing across multiple Linux machines. Linux Ollama setup for Llama, Qwen, DeepSeek, Phi, Mistral. Route Ollam...

by Twin Geeks (@twinsgeeks)
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal: Pending
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description describe multi-machine Ollama routing, and the SKILL.md contains step-by-step instructions to install Ollama, install a 'herd' Python package, run router/node processes, configure systemd and firewall rules, and monitor files under ~/.fleet-manager. The declared anyBins (curl|wget) and optional bins (python3, pip, systemctl, nvidia-smi) are all referenced in the instructions, and the provided configPaths (~/.fleet-manager/...) are used in the examples; this is coherent with the stated purpose.
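The declared config layout can be sketched as a small path helper. Only the ~/.fleet-manager root is confirmed by the registry metadata; the logs/ subdirectory and file naming below are assumptions for illustration:

```python
from pathlib import Path

# Root confirmed by the skill's declared configPaths; subpaths are assumed.
FLEET_DIR = Path.home() / ".fleet-manager"

def monitor_file(name: str) -> Path:
    """Return a path to a hypothetical monitor/log file under the fleet dir."""
    return FLEET_DIR / "logs" / f"{name}.log"
```

This is the kind of helper you might use when auditing what the router and node processes write locally.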
Instruction Scope
The instructions stay within the stated purpose: installing Ollama, installing the herd package, starting router/node processes, systemd integration, firewall rules, monitoring endpoints and logs, and examples for API usage. The SKILL.md references only the declared config paths and common tooling; it does not instruct reading unrelated system credentials or other users' data. One minor note: the examples show OpenAI client usage against a local base_url with api_key set to 'not-needed' (demonstrative), which is not a request for secrets.
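As a sketch of why the 'not-needed' api_key is harmless: an OpenAI-compatible chat request is just a JSON payload sent to a local base_url, and a local router that performs no auth simply ignores the key. The endpoint path and model name here are assumptions, not taken from the SKILL.md:

```python
import json

# Assumed OpenAI-compatible endpoint; the review notes the router on TCP 11435.
ROUTER_URL = "http://localhost:11435/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> str:
    # Standard OpenAI-style chat payload; any placeholder api_key (e.g.
    # "not-needed") would do, since a no-auth local router never checks it.
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
```

No secret ever appears in the payload itself; the key only travels in an Authorization header, which is why a placeholder is safe against a trusted local endpoint.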
Install Mechanism
This is instruction-only (no installer in the registry). The install steps ask users to run curl -fsSL https://ollama.ai/install.sh | sh and pip install ollama-herd. Both are coherent for installing a system binary and a Python package, but piping a remote script into sh and installing PyPI packages execute remote code and should be reviewed/audited before running. No arbitrary archive downloads from unknown hosts are present in the SKILL.md.
Credentials
The skill does not require any credentials or environment variables in the registry metadata. It suggests non-sensitive Ollama tuning environment variables (OLLAMA_*), which are reasonable for runtime configuration. No secrets (API keys, tokens, passwords) are requested or required by the skill itself. The example use of an OpenAI client is illustrative and does not request a real key.
Persistence & Privilege
The 'always' flag is false and the skill does not request elevated platform privileges. The instructions suggest enabling systemd services for the router/node so the processes run at boot, which is appropriate behavior for a network service. The skill does not direct modifying other skills or system-wide agent settings beyond installing and enabling its own services.
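A systemd user unit for the node process might look like the following. The unit name and ExecStart command are assumptions for illustration, not taken from the SKILL.md; adjust them to whatever the herd package actually installs:

```ini
# ~/.config/systemd/user/herd-node.service (hypothetical name and command)
[Unit]
Description=Ollama herd node
After=network-online.target

[Service]
ExecStart=/usr/bin/python3 -m herd node
Restart=on-failure

[Install]
WantedBy=default.target
```

A user unit (enabled via `systemctl --user enable --now herd-node.service`) keeps the service in the user's own session scope rather than requiring root, which fits the skill's non-elevated privilege profile.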
Assessment
This skill appears internally consistent with its purpose, but exercise normal caution before following the install steps:
1) Inspect the remote install script (https://ollama.ai/install.sh) before piping it into sh; prefer downloading and reviewing it first, or use a distro-packaged installer if available.
2) Review the 'ollama-herd' PyPI package source (or its GitHub repo) before pip installing; consider installing in a virtualenv or sandbox.
3) The router listens on TCP 11435. Avoid exposing this port to the public internet without authentication; lock it to your LAN or put it behind an authenticated reverse proxy.
4) Confirm how the herd handles authentication/authorization between nodes (the SKILL.md shows mDNS and direct IP but does not document auth), and require secure networking (VPN/TLS) between nodes if used across untrusted networks.
5) Monitor ~/.fleet-manager logs and ensure any retained sensitive data is handled appropriately.
If you need higher assurance, test the setup in an isolated VM or lab network and audit the GitHub repo (https://github.com/geeks-accelerator/ollama-herd) and the install scripts before production use.
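The LAN-only recommendation for the router port can be checked mechanically. A small standard-library helper that flags bind addresses which would expose TCP 11435 beyond loopback or private address space (the policy is this review's suggestion, not part of the skill):

```python
import ipaddress

def bind_is_lan_only(addr: str) -> bool:
    """True if addr keeps the listener off the public internet:
    loopback or private (RFC 1918 / ULA) addresses pass; the
    unspecified address (0.0.0.0 / ::) fails, since it binds
    every interface, including public ones."""
    ip = ipaddress.ip_address(addr)
    if ip.is_unspecified:
        return False
    return ip.is_loopback or ip.is_private
```

You could run this against the router's configured bind address before enabling the service, and refuse to start if it returns False.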

Like a lobster shell, security has layers — review code before you run it.

latest: vk97chz929a1xe7c9d6c3z1w0e5844rr9


Runtime requirements

Clawdis
OS: Linux
Any bin: curl, wget
