Local LLM Router

Pass. Audited by ClawScan on May 1, 2026.

Overview

This skill is a coherent local LLM router guide. The expected risks come from installing an external package, running local router/node services, exposing a local management API, and storing local operational logs.

The skill appears safe to use as a local LLM routing setup. That said: install ollama-herd only from a trusted source, launch the router and node agents deliberately, keep the localhost management API private, and review local logging if your LLM usage metadata is sensitive.

Findings (4)

This is an artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.

Finding 1: External package installation

What this means

Installing the package will run third-party local software on the user's machine.

Why it was flagged

The skill directs the user to install an external package, while the provided artifact set contains only instructions and no package code. This is expected for the router's purpose, but users should still verify the package's provenance.

Skill content
pip install ollama-herd           # install the local LLM router
Recommendation

Verify the PyPI package and GitHub repository, consider pinning a trusted version, and install in a controlled Python environment.
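One way to follow this is to install into a dedicated virtual environment from a pinned, hash-checked requirements file. A sketch (the version and hash below are placeholders, not verified values for ollama-herd):

```
# requirements.txt -- pin the exact release you verified on PyPI/GitHub
ollama-herd==X.Y.Z \
    --hash=sha256:PLACEHOLDER_HASH_FROM_THE_VERIFIED_RELEASE
```

With hash-checking enabled (`pip install --require-hashes -r requirements.txt`), pip refuses to install any artifact that does not match the pinned hash.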

Finding 2: Long-running local services

What this means

The router and node agents are long-running processes that continue handling local LLM requests until they are stopped.

Why it was flagged

The setup instructions launch long-running local executables. This is disclosed and purpose-aligned, but it means the skill depends on running local services.

Skill content
herd                              # launch the local LLM router (scores and routes)
herd-node                         # launch a local LLM node agent on each device
Recommendation

Run these commands only when you intend to operate the router, and know how to stop the processes or uninstall the package.
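To operate the router deliberately, it helps to know the teardown path as well. A best-effort sketch (process names are taken from the commands above; `pkill -x` matches the exact process name and may need adjusting for your environment):

```shell
# Stop any running router/node processes (no-op if none are running).
pkill -x herd 2>/dev/null || true
pkill -x herd-node 2>/dev/null || true

# Check what is installed, then remove the package from this environment.
command -v herd || echo "herd not on PATH"
command -v pip >/dev/null && pip uninstall -y ollama-herd || true
```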

Finding 3: Local management API

What this means

Running management API commands can change how local model routing and downloads behave.

Why it was flagged

The documented local API includes POST operations that can change router settings. This is normal for a management interface but should remain user-directed.

Skill content
curl -s -X POST http://localhost:11435/dashboard/api/settings \
  -H "Content-Type: application/json" \
  -d '{"auto_pull": false}'
Recommendation

Review POST commands before running them, especially model-pull or settings changes, and keep the management endpoint limited to trusted local use.
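The review step can be made explicit with a small dry-run wrapper around the documented settings endpoint. A minimal sketch; the helper name and `dry_run` flag are illustrative, not part of ollama-herd:

```python
import json
import urllib.request

# Endpoint path as documented by the skill (localhost management API).
SETTINGS_URL = "http://localhost:11435/dashboard/api/settings"

def post_settings(changes: dict, dry_run: bool = True) -> bytes:
    """Print the exact request that would be sent; only POST when dry_run is False."""
    body = json.dumps(changes).encode("utf-8")
    print(f"POST {SETTINGS_URL}\n{body.decode()}")
    if not dry_run:
        req = urllib.request.Request(
            SETTINGS_URL,
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:  # contacts the local router
            return resp.read()
    return body

# Review the request without touching the router:
post_settings({"auto_pull": False})
```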

Finding 4: Local logs and usage metadata

What this means

Local logs and traces may expose which models were used, routing decisions, timing, and token counts.

Why it was flagged

The skill declares a local persistent database and log files for latency and router activity. This is expected for routing and monitoring but may reveal local usage patterns.

Skill content
"configPaths": ["~/.fleet-manager/latency.db", "~/.fleet-manager/logs/herd.jsonl"]
Recommendation

Check the router's logging configuration and retention behavior if model usage patterns or request metadata are sensitive.
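A quick way to see what the logs actually retain is to summarize which fields each JSONL record carries. A minimal sketch against the declared log path; the field names it reports depend entirely on what ollama-herd writes:

```python
import json
from collections import Counter
from pathlib import Path

# Log path as declared in the skill's configPaths.
LOG_PATH = Path("~/.fleet-manager/logs/herd.jsonl").expanduser()

def summarize_log(path: Path) -> Counter:
    """Count how often each field appears across JSONL records."""
    fields = Counter()
    if not path.exists():
        return fields
    for line in path.read_text().splitlines():
        try:
            fields.update(json.loads(line).keys())
        except json.JSONDecodeError:
            continue  # skip partial or corrupt lines
    return fields

for field, count in summarize_log(LOG_PATH).most_common():
    print(f"{field}: {count} records")
```

If fields like model names, timing, or token counts appear and are sensitive to you, that is the signal to tighten the router's logging or retention settings.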