Local LLM Router
Pass. Audited by ClawScan on May 1, 2026.
Overview
This skill is a coherent local LLM router guide. Its expected risks come from installing an external package, running local router/node services, exposing a local management API, and storing local operational logs.
The skill appears safe on review as a local LLM routing setup. Still: install ollama-herd only from a trusted source, launch the router and node agents deliberately, keep the localhost management API private, and review local logging if your LLM usage metadata is sensitive.
Findings (4)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Installing the package will run third-party local software on the user's machine.
The skill directs the user to install an external package, while the provided artifact set contains only instructions and no package code. This is expected for a router skill, but provenance is still something users should verify themselves.
pip install ollama-herd # install the local LLM router
Verify the PyPI package and GitHub repository, consider pinning a trusted version, and install in a controlled Python environment.
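As one hedged sketch of the version-pinning advice (the version string below is a placeholder, not a real ollama-herd release), importlib.metadata can confirm which version pip actually installed, so a drifted or missing install is caught before you run anything:

```python
from importlib import metadata

PINNED = "1.2.3"  # placeholder: substitute the version you actually reviewed

def check_pin(dist: str, expected: str) -> bool:
    """Return True only if `dist` is installed at exactly `expected`."""
    try:
        return metadata.version(dist) == expected
    except metadata.PackageNotFoundError:
        # not installed at all counts as a failed check
        return False

# After `pip install "ollama-herd==1.2.3"`, check_pin("ollama-herd", PINNED)
# returns True; it returns False if the package is missing or a different
# version was resolved.
```

Running this inside the same controlled environment you installed into keeps the check honest about what that environment actually contains.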
The router and node agent may continue handling local LLM requests while running.
The setup instructions launch local executable commands. This is disclosed and purpose-aligned, but it means the skill depends on running local services.
herd       # launch the local LLM router (scores and routes)
herd-node  # launch a local LLM node agent on each device
Run these commands only when you intend to operate the router, and know how to stop the processes or uninstall the package.
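To know what is still running, a minimal sketch (assuming a Unix-like system with the standard pgrep tool; the process names come from the commands above) lists matching PIDs so you can stop them deliberately:

```python
import subprocess

def find_procs(pattern: str) -> list[int]:
    # pgrep -f matches against the full command line of running processes;
    # it exits non-zero with empty output when nothing matches
    out = subprocess.run(["pgrep", "-f", pattern],
                         capture_output=True, text=True)
    return [int(pid) for pid in out.stdout.split()]

# e.g. find_procs("herd-node") lists node-agent PIDs; stop them with
#   kill <pid>
# and run `pip uninstall ollama-herd` once nothing is left running.
```

Checking for leftover processes before uninstalling avoids orphaned agents that keep handling requests after you think the router is gone.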
Running management API commands can change how local model routing and downloads behave.
The documented local API includes POST operations that can change router settings. This is normal for a management interface but should remain user-directed.
curl -s -X POST http://localhost:11435/dashboard/api/settings \
-H "Content-Type: application/json" \
-d '{"auto_pull": false}'
Review POST commands before running them, especially model-pull or settings changes, and keep the management endpoint limited to trusted local use.
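One small habit that supports the review-before-running advice: parse the JSON body locally before sending it, so a typo cannot silently change the request. A sketch using the payload from the curl example above:

```python
import json

payload = '{"auto_pull": false}'   # body from the settings POST above

parsed = json.loads(payload)       # raises ValueError on a malformed body
assert parsed == {"auto_pull": False}, "payload does not say what you meant"
print(json.dumps(parsed, indent=2))  # read it back before you POST it
```

The same pattern scales to larger settings changes: build the dict in Python, assert the fields you care about, and only then hand the serialized body to curl or an HTTP client.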
Local logs and traces may expose which models were used, routing decisions, timing, and token counts.
The skill declares local persistent database and log files for latency and router activity. This is expected for routing and monitoring but may reveal local usage patterns.
"configPaths": ["~/.fleet-manager/latency.db", "~/.fleet-manager/logs/herd.jsonl"]
Check the router's logging configuration and retention behavior if model usage patterns or request metadata are sensitive.
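To see concretely what the logs reveal, a hedged sketch (the field names below are illustrative stand-ins, not the router's actual schema) that collects the top-level keys a JSONL log records:

```python
import json
from pathlib import Path

def recorded_fields(path: Path) -> set[str]:
    """Union of top-level keys across all entries in a JSONL log."""
    keys: set[str] = set()
    for line in path.read_text().splitlines():
        if line.strip():
            keys.update(json.loads(line).keys())
    return keys

# Demo against a stand-in file; on a real install, point this at the
# herd.jsonl path from the config above.
demo = Path("herd-demo.jsonl")
demo.write_text('{"model": "llama3", "latency_ms": 812, "tokens": 164}\n')
print(recorded_fields(demo))  # each field here is usage metadata
```

If the resulting field set includes model names, timings, or token counts you consider sensitive, that is the signal to tighten the router's logging configuration or retention.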
