Ask Council

Pass. Audited by ClawScan on May 1, 2026.

Overview

This skill appears purpose-aligned: it sends a user’s question to a local LLM Council backend and returns the synthesized answer, with minor trust and privacy considerations.

This skill is reasonable to install if you trust your local LLM Council backend. Be aware that prompts are submitted to that backend and may be stored as conversations, and that the output includes a local link to the full discussion. Avoid sending secrets unless the backend and its model-provider setup are trusted.

Findings (3)

This report is an artifact-based, informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.

Finding 1: Starts a backend deliberation run

What this means

Using the skill sends the question to the local LLM Council backend and starts a model deliberation run.

Why it was flagged

The script starts a run through the local LLM Council API using the user's question. This is disclosed and central to the skill, but it is still a real backend action that may consume model resources.

Skill content
API_BASE="http://127.0.0.1:8001" ... curl -s -X POST "$API_BASE/api/conversations/$CONVO_ID/runs"
Recommendation

Use it when you intend to submit the question to your LLM Council backend, and avoid accidental invocations for private or costly prompts.
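For readers assessing this finding, a minimal sketch of the disclosed two-step flow (create a conversation, then start a run) may help. The endpoint paths come from the report's "Skill content" excerpts; the request body, the `"id"` response field, and the stubbed `curl` are assumptions added so the sketch runs without a live backend.

```shell
#!/bin/sh
API_BASE="http://127.0.0.1:8001"

# Stub curl so the sketch runs offline; remove this function to hit a real backend.
# The JSON shape returned here is an assumption, not the backend's documented API.
curl() { echo '{"id":"abc123"}'; }

# Step 1: create a conversation for the user's question.
CONVO_RESPONSE=$(curl -s -X POST "$API_BASE/api/conversations" \
  -H 'Content-Type: application/json' \
  -d '{"question":"example question"}')

# Extract the conversation id (field name "id" is an assumption).
CONVO_ID=$(printf '%s' "$CONVO_RESPONSE" | sed -n 's/.*"id":"\([^"]*\)".*/\1/p')

# Step 2: start the deliberation run; this is the call that consumes
# backend model resources, as the finding notes.
curl -s -X POST "$API_BASE/api/conversations/$CONVO_ID/runs" > /dev/null
echo "started run for conversation $CONVO_ID"
```

The key point for users: step 2 fires on every invocation, so an accidental invocation still costs a full deliberation run.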

Finding 2: Depends on a separately trusted backend

What this means

The safety and privacy of answers also depend on the separate LLM Council backend and its configured model providers.

Why it was flagged

The skill relies on a separately installed/running backend that is outside this package. That dependency is disclosed, but it expands what the user must trust.

Skill content
LLM Council backend must be running: `/install-llm-council`
Recommendation

Install and run this only with an LLM Council backend you trust, and review that backend's configuration separately.

Finding 3: Conversations are retained and shareable by link

What this means

Questions and model responses may be retained in the local backend, and anyone who sees the link and can reach the service may view the full discussion.

Why it was flagged

Each use creates a backend conversation and prints a link containing the local IP address and conversation short ID.

Skill content
CONVO_RESPONSE=$(curl -s -X POST "$API_BASE/api/conversations" ...); ... echo "View full discussion: http://${LOCAL_IP}:5173/c/${SHORT_ID}"
Recommendation

Do not submit secrets or sensitive personal data unless you understand how your LLM Council backend stores and exposes conversations.
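To make the exposure concrete, here is a minimal sketch of how the printed link is assembled. Variable names follow the report's excerpt; the example IP and short ID are placeholders, and in the real skill `LOCAL_IP` would be detected from the host and `SHORT_ID` returned by the backend.

```shell
#!/bin/sh
LOCAL_IP="192.168.1.50"  # placeholder; the skill detects the host's LAN address
SHORT_ID="abc123"        # placeholder; returned when the conversation is created

# The link embeds the machine's local IP, so anyone on the same network
# who sees it and can reach port 5173 can open the full discussion.
LINK="http://${LOCAL_IP}:5173/c/${SHORT_ID}"
echo "View full discussion: $LINK"
```

Because the link is guessable only via the short ID and carries no authentication in this sketch, treating every printed link as network-visible is the safer assumption.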