DeepSeek DeepSeek-Coder
v1.0.3 · Run DeepSeek-V3, DeepSeek-R1, and DeepSeek-Coder across your local fleet. 7-signal scoring routes every request to the best device. Cro...
⭐ 2 · 71 · 1 current · 1 all-time
by Twin Geeks (@twinsgeeks)
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan (OpenClaw)
Verdict: Benign (medium confidence)

Purpose & Capability
Name/description, examples (curl/OpenAI SDK), and required tools (curl/wget, optional python/pip) all align with running a local fleet router and calling a localhost Ollama-compatible API. The referenced config paths (~/.fleet-manager/latency.db, ~/.fleet-manager/logs/herd.jsonl) are consistent with a fleet manager's state and logs.
Instruction Scope
SKILL.md instructs the user to install 'ollama-herd', run local processes (herd, herd-node), and make requests to localhost:11435. It does not instruct reading unrelated system files, exporting secrets, or sending data to external endpoints, beyond pulling models on demand (which the docs say requires confirmation).
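As a sketch of what such a localhost request carries, the body of an OpenAI-compatible chat-completions call to the router might look like the following (the model tag is illustrative, not confirmed by the listing):

```python
import json

# Illustrative request body for the localhost router's OpenAI-compatible
# endpoint (http://localhost:11435/v1/chat/completions per the SKILL.md).
# The model tag below is an assumption; use whatever tags your fleet serves.
payload = {
    "model": "deepseek-coder",
    "messages": [
        {"role": "user", "content": "Write a binary search in Python."}
    ],
    "stream": False,
}
body = json.dumps(payload)
print(body)
```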
Install Mechanism
This is instruction-only with no install spec; the docs instruct users to 'pip install ollama-herd'. That is coherent but introduces typical supply-chain risk because installing a PyPI package runs third-party code on your machine and may trigger on-demand model downloads. The SKILL itself does not include or pin any binaries.
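One minimal way to contain that supply-chain risk is to keep the package out of system site-packages; a sketch, using the package name from the docs (review its source before actually installing):

```shell
# Create an isolated virtualenv so the unverified PyPI package cannot
# touch system site-packages.
python3 -m venv herd-venv

# After reviewing the package (maintainer identity, recent commits, issues),
# install inside the venv only:
# herd-venv/bin/pip install ollama-herd
```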
Credentials
No environment variables or credentials are requested. Example code uses a localhost base_url and sets api_key to 'not-needed'. There are no unexpected credential requests in the SKILL.md or metadata.
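To illustrate, a client pointed at the local router needs only a placeholder key; a stdlib sketch, with the port and path assumed from the listing's examples:

```python
import urllib.request

# The router is OpenAI-compatible on localhost, so the Authorization header
# carries a dummy value ("not-needed") rather than a real credential.
base_url = "http://localhost:11435/v1"
req = urllib.request.Request(
    f"{base_url}/chat/completions",
    headers={
        "Authorization": "Bearer not-needed",  # placeholder, no real secret
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
```

Note that no request is sent here; the point is only that configuring the client involves no genuine credential.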
Persistence & Privilege
The 'always' flag is false (the skill is not force-included). The skill does not request elevated platform privileges or attempt to modify other skills' configuration. Autonomous invocation is allowed by default, which is normal; nothing else indicates a persistent privileged presence.
Assessment
This skill is internally consistent with its purpose, but before installing:
(1) verify the PyPI package (ollama-herd) and the GitHub repo: review recent commits, maintainer identity, and issues;
(2) install in a virtualenv or isolated VM if you are unsure;
(3) be prepared for very large model downloads, and ensure you have the disk space and bandwidth;
(4) confirm that any model download prompts you before proceeding (the docs claim confirmation is required);
(5) check which filesystem paths the herd service writes to (~/.fleet-manager/...) and restrict permissions if needed;
(6) consider running the router behind a firewall or on a localhost-only interface to avoid exposing models to the network.

If you want higher confidence, request the actual repository code or a pinned package artifact for review before installing.

Like a lobster shell, security has layers: review code before you run it.
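Points (5) and (6) can be sketched in shell. The state directory comes from the scan; the serve command and its flags are hypothetical, so check the herd CLI's own help for the real options:

```shell
# (5) Restrict the fleet manager's state directory (latency.db, logs)
# to your user only.
mkdir -p "$HOME/.fleet-manager"
chmod 700 "$HOME/.fleet-manager"

# (6) When starting the router yourself, bind to loopback only so models
# are not exposed on the network. Flag names below are assumptions:
# herd serve --host 127.0.0.1 --port 11435
```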
Tags: apple-silicon · code-generation · coding · deepseek · deepseek-coder · deepseek-r1 · deepseek-v3 · fleet-routing · latest · local-llm · ollama · reasoning · self-hosted
Runtime requirements
Brain: Clawdis
OS: macOS · Linux · Windows
Binaries (any): curl, wget
