Ollama Local

Pass. Audited by ClawScan on May 1, 2026.

Overview

This skill appears purpose-aligned for Ollama management, but it can change your Ollama model list and send prompts to whatever Ollama server you configure.

This skill is reasonable to install if you want Ollama model management and local inference helpers. Before use, confirm the Ollama host, avoid sending private data to untrusted remote servers, and treat pull/remove/sub-agent commands as actions that can affect your local resources or model inventory.

Findings (4)

Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.

What this means

A mistaken or autonomous remove command could delete an installed model from the selected Ollama server.

Why it was flagged

The helper can delete an Ollama model from the configured server. This is disclosed model-management behavior, but it changes local or remote model state.

Skill content
api_request("/api/delete", method="DELETE", data={"name": model_name})
Recommendation

Use remove commands only for a specific model the user asked to delete, and confirm the intended OLLAMA_HOST before running them.
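One way to follow this recommendation is to gate the delete behind an explicit match between the model the user named and the model about to be removed. The sketch below is illustrative, not part of the skill: `confirm_model_delete` is a hypothetical guard, and the payload it returns is what a caller would hand to the skill's `api_request` helper.

```python
import os

def confirm_model_delete(requested, user_confirmed, host=None):
    """Build a delete payload only when the model the user explicitly
    confirmed matches the model about to be removed (hypothetical guard)."""
    host = host or os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    if requested != user_confirmed:
        raise ValueError(
            f"refusing to delete {requested!r}: user confirmed {user_confirmed!r}"
        )
    # The caller would then pass this payload to the skill's helper:
    #   api_request("/api/delete", method="DELETE", data={"name": payload["name"]})
    return {"host": host, "name": requested}
```

Surfacing the resolved host in the payload also makes it easy to show the user which server the delete will hit before it runs.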

What this means

Pulling models can download large external artifacts whose behavior and provenance depend on the selected Ollama model source.

Why it was flagged

The pull helper asks Ollama to download a named model. This is central to the skill, but model provenance and trust are delegated to Ollama and the chosen model tag.

Skill content
for chunk in api_stream("/api/pull", {"name": model_name}):
Recommendation

Pull models from trusted sources and prefer known model names/tags for sensitive workflows.
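For sensitive workflows, the recommendation can be enforced with a small allowlist check before any pull. This is a sketch under assumptions: `vet_pull_target` and the `TRUSTED_MODELS` entries are hypothetical examples, not names endorsed by the skill or by ClawScan.

```python
# Hypothetical allowlist; populate with the models your workflow has vetted.
TRUSTED_MODELS = {"llama3:8b", "mistral:7b", "nomic-embed-text:latest"}

def vet_pull_target(model_name, trusted=TRUSTED_MODELS):
    """Accept a pull target only if it carries an explicit tag and appears
    on the local allowlist; otherwise refuse the pull."""
    if ":" not in model_name:
        raise ValueError(f"{model_name!r} has no explicit tag; pin one (e.g. ':8b')")
    if model_name not in trusted:
        raise ValueError(f"{model_name!r} is not on the trusted allowlist")
    # The caller would then stream the download via the skill's helper:
    #   for chunk in api_stream("/api/pull", {"name": model_name}): ...
    return model_name
```

Requiring an explicit tag avoids silently pulling whatever `latest` currently resolves to on the model registry.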

What this means

If OLLAMA_HOST points to a remote server, private prompts or embedding text may leave the local machine, and the examples use plain HTTP.

Why it was flagged

The skill explicitly supports a remote Ollama host. Chat prompts, system prompts, and embedding text are sent to the configured host.

Skill content
export OLLAMA_HOST="http://localhost:11434"
# Or for remote server:
export OLLAMA_HOST="http://192.168.1.100:11434"
Recommendation

Keep OLLAMA_HOST on localhost for private data, or use only trusted remote Ollama servers and appropriate network protections.
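A caller can check the configured host before sending anything private. The helper below is a minimal sketch (the function names are assumptions, not part of the skill): it parses OLLAMA_HOST and refuses to proceed when the hostname is not a loopback address.

```python
from urllib.parse import urlparse

LOCAL_HOSTS = {"localhost", "127.0.0.1", "::1"}

def is_local_ollama(host_url):
    """True when the URL's hostname is a loopback address, so prompts,
    system prompts, and embedding text never cross the network."""
    return (urlparse(host_url).hostname or "") in LOCAL_HOSTS

def check_private_data_host(host_url):
    """Raise before any request if private data would go to a remote host."""
    if not is_local_ollama(host_url):
        raise RuntimeError(f"refusing to send private data to remote host {host_url}")
```

Note this only checks locality; it does not make a remote host safe, and plain-HTTP remote hosts would still expose traffic in transit.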

Note · High Confidence
ASI10: Rogue Agents
What this means

Sub-agents may consume local compute resources and receive task details that the user may not expect to share across multiple agents.

Why it was flagged

The documentation shows spawning one or more local Ollama sub-agents. This is disclosed and purpose-aligned, but it delegates task context to additional agents.

Skill content
Spawn local model sub-agents with `sessions_spawn` ... for a in agents:
    sessions_spawn(task=a["task"], model=a["model"], label=a["label"])
Recommendation

Spawn sub-agents only when useful for the user’s task, limit the number of agents, and avoid passing secrets unless necessary.
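Both parts of this recommendation (capping agent count and withholding secrets) can be applied before delegation. The sketch below is hypothetical: `plan_spawns`, `MAX_AGENTS`, and the secret-like key names are illustrative assumptions, and the cleaned specs would feed the skill's `sessions_spawn` as shown in the snippet above.

```python
MAX_AGENTS = 3  # hypothetical cap; tune to local compute

def plan_spawns(agents, max_agents=MAX_AGENTS):
    """Trim the agent list to a fixed cap and drop obviously secret-like
    keys from each spec before delegating (illustrative heuristic only)."""
    secretish = {"api_key", "token", "password"}
    cleaned = []
    for a in agents[:max_agents]:
        cleaned.append({k: v for k, v in a.items() if k not in secretish})
    # Each cleaned entry would then go to the skill's spawn helper:
    #   sessions_spawn(task=a["task"], model=a["model"], label=a["label"])
    return cleaned
```

A key filter like this is a coarse safeguard; it does not catch secrets embedded inside task text, so task descriptions still deserve a manual glance before fan-out.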