Local-First LLM
Pass. Audited by ClawScan on May 1, 2026.
Overview
The skill is coherent and purpose-aligned, but users should remember that it is local-first rather than local-only, that it includes user-run install commands for a local provider, and that it stores a small local usage log.
Install if you are comfortable with a local-first workflow that can still fall back to cloud. For sensitive work, ensure a local provider is running and ask for confirmation before any cloud route. Verify third-party provider install commands, and reset or delete the local savings file if you do not want usage metadata retained.
Findings (3)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
If a local model is not running, the workflow may direct a request to cloud rather than keeping it local.
The routing helper checks provider availability before the sensitive-prompt branch, so it can choose cloud whenever no local provider is marked available, even for sensitive prompts. This matches the disclosed fallback behavior, but users should not treat the skill as local-only.
if not args.local_available:
    decision = "cloud"
    reason = "No local LLM provider is running"
elif sensitive:
    decision = "local"
For private or regulated prompts, start a local provider first or require explicit confirmation before following a cloud routing decision.
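The confirmation step recommended above can be expressed as a small change to the routing logic. This is a minimal sketch, not the skill's actual code: the `route` function name, its arguments, and the `"confirm"` decision value are assumptions added for illustration.

```python
# Hypothetical confirmation-gated router; mirrors the skill's disclosed
# logic but surfaces sensitive-prompt fallbacks instead of silently
# routing them to cloud. Names and return shape are illustrative.
def route(local_available: bool, sensitive: bool) -> tuple[str, str]:
    if local_available:
        # A running local provider always wins.
        return ("local", "Local LLM provider is running")
    if sensitive:
        # Instead of falling back to cloud, ask the user first.
        return ("confirm", "Sensitive prompt but no local provider; confirm before cloud")
    return ("cloud", "No local LLM provider is running")
```

With this gate, a sensitive prompt with no local provider yields a `"confirm"` decision rather than an immediate cloud route, which matches the recommendation to require explicit confirmation.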
Installing a local provider from a remote script or downloaded executable gives that provider's installer control over the local machine.
The reference setup includes a user-directed remote shell install command for Ollama. It is purpose-aligned for installing a local provider, but remote install scripts should be reviewed and verified before they are run.
curl -fsSL https://ollama.ai/install.sh | sh
Verify provider sources, prefer official package managers or reviewed installers when possible, and run setup commands only if you trust the provider.
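One way to avoid piping a remote script straight into a shell is to download it first, compare it against a checksum obtained from a source you trust, and only then execute it. The sketch below assumes you have such a checksum; the `EXPECTED_SHA256` value is a placeholder, not a real digest for any installer.

```python
import hashlib
import sys

# Placeholder digest: replace with a checksum obtained out of band
# from a source you trust. This value is NOT a real installer hash.
EXPECTED_SHA256 = "0" * 64

def sha256_of(path: str) -> str:
    """Hash a downloaded installer script in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str) -> bool:
    """Return True only if the file matches the expected digest."""
    digest = sha256_of(path)
    if digest != EXPECTED_SHA256:
        print(f"Checksum mismatch: {digest}", file=sys.stderr)
        return False
    return True
```

Download with `curl -fsSL https://ollama.ai/install.sh -o install.sh`, read the script, run `verify("install.sh")`, and only execute it if the check passes.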
A local history of routing and token-savings metadata remains on disk until reset or deleted.
The skill intentionally persists dashboard data in the user's home directory. The recorded entries shown in code include timestamp, token count, model, route, and cost saved, not prompt text.
DB_PATH = os.path.expanduser("~/.openclaw/local-first-llm/savings.json")
Use the documented reset command or remove the savings.json file if you do not want this usage history retained.
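Because the log is plain JSON, it is easy to inspect or clear yourself. The sketch below assumes the file holds a JSON list of entries with a `cost_saved` field, matching the fields named above; the exact schema, key names, and helper functions are assumptions, not the skill's documented API.

```python
import json
import os

# Path disclosed in the finding above.
DB_PATH = os.path.expanduser("~/.openclaw/local-first-llm/savings.json")

def summarize(path: str = DB_PATH) -> dict:
    """Count entries and total cost saved; key names are assumed."""
    if not os.path.exists(path):
        return {"entries": 0, "cost_saved": 0.0}
    with open(path) as f:
        entries = json.load(f)
    return {
        "entries": len(entries),
        "cost_saved": sum(e.get("cost_saved", 0.0) for e in entries),
    }

def reset(path: str = DB_PATH) -> None:
    """Clear the usage history by writing an empty list."""
    with open(path, "w") as f:
        json.dump([], f)
```

Running `summarize()` shows how much metadata has accumulated, and `reset()` (or simply deleting the file) removes it.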
