Ask Council
Pass. Audited by VirusTotal on May 12, 2026.
Overview
Type: OpenClaw Skill
Name: ask-council
Version: 1.0.4

The skill bundle is benign. The `ask-council.sh` script interacts exclusively with a local API endpoint (`http://127.0.0.1:8001`). User input is safely escaped using `python3 -c 'import json, sys; print(json.dumps(sys.stdin.read().strip()))'` before being embedded in JSON payloads for `curl` requests, preventing shell and JSON injection. The use of `hostname -I` is for displaying a local IP address in a URL for user convenience, not for malicious network activity. There is no evidence of data exfiltration, malicious execution, persistence mechanisms, or prompt-injection attempts against the agent within the skill's instructions or code.
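A minimal sketch of that escaping pattern, assuming illustrative variable names (`QUESTION`, `ESCAPED`, `PAYLOAD`) that are not taken from `ask-council.sh`:

```bash
#!/usr/bin/env bash
# Illustrative input containing quotes and shell metacharacters.
QUESTION='What is 2 + 2? And "quotes" or $(subshell) text'

# json.dumps emits a fully quoted JSON string literal, so characters that
# would break the shell command or the JSON payload are neutralized.
ESCAPED=$(printf '%s' "$QUESTION" \
  | python3 -c 'import json, sys; print(json.dumps(sys.stdin.read().strip()))')

# The escaped value can now be embedded directly in a JSON payload for curl.
PAYLOAD="{\"content\": $ESCAPED}"   # key name "content" is an assumption
echo "$PAYLOAD"
```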
Findings (3)
This is an artifact-based, informational review of SKILL.md, metadata, install specs, static-scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Finding 1 (informational): Using the skill sends the question to the local council backend and starts a model deliberation run.

The script starts a run through the local LLM Council API using the user's question. This is disclosed and central to the skill, but it is still a real backend action that may consume model resources (a fuller sketch follows this finding).

Evidence:

```
API_BASE="http://127.0.0.1:8001"
...
curl -s -X POST "$API_BASE/api/conversations/$CONVO_ID/runs"
```

Recommendation: Use it when you intend to submit the question to your LLM Council backend, and avoid accidental invocations for private or costly prompts.
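For illustration, a hedged sketch of that conversation-and-run flow. The two endpoint paths come from the evidence above; the response field name `id`, the payload key `question`, and the sample question are assumptions:

```bash
#!/usr/bin/env bash
API_BASE="http://127.0.0.1:8001"

# Escape the question with the same pattern the skill uses (see Overview).
QUESTION=$(printf '%s' "Why is the sky blue?" \
  | python3 -c 'import json, sys; print(json.dumps(sys.stdin.read().strip()))')

# Create a conversation on the local backend.
CONVO_RESPONSE=$(curl -s -X POST "$API_BASE/api/conversations")
# Extract the conversation ID; the "id" field name is an assumption.
CONVO_ID=$(printf '%s' "$CONVO_RESPONSE" \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["id"])')

# Start the deliberation run; this is the real backend action that may
# consume model resources.
curl -s -X POST "$API_BASE/api/conversations/$CONVO_ID/runs" \
  -H 'Content-Type: application/json' \
  -d "{\"question\": $QUESTION}"   # payload key "question" is an assumption
```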
Finding 2 (informational): The safety and privacy of answers also depend on the separate LLM Council backend and its configured model providers.

The skill relies on a separately installed and running backend that is outside this package. That dependency is disclosed, but it expands what the user must trust.

Evidence: LLM Council backend must be running: `/install-llm-council`

Recommendation: Install and run this only with an LLM Council backend you trust, and review that backend's configuration separately (a reachability check is sketched below).
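One way to act on that recommendation is to confirm the backend is actually answering before invoking the skill. The base URL comes from the audit; the probed path, the timeout, and treating any HTTP error as "unreachable" are assumptions:

```bash
#!/usr/bin/env bash
# Probe the root path; -f makes curl fail on HTTP error codes, so this only
# confirms that something is listening and responding on the expected port.
if ! curl -sf --max-time 2 "http://127.0.0.1:8001/" >/dev/null; then
  echo "LLM Council backend not reachable on 127.0.0.1:8001; run /install-llm-council first" >&2
  exit 1
fi
```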
Finding 3 (informational): Questions and model responses may be retained in the local backend, and anyone who sees the link and can reach the service may view the full discussion.

Each use creates a backend conversation and prints a link containing the local IP address and conversation short ID.

Evidence:

```
CONVO_RESPONSE=$(curl -s -X POST "$API_BASE/api/conversations" ...); ...
echo "View full discussion: http://${LOCAL_IP}:5173/c/${SHORT_ID}"
```

Recommendation: Do not submit secrets or sensitive personal data unless you understand how your LLM Council backend stores and exposes conversations.
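A sketch of how the printed link could be assembled. The port 5173 and the link format appear in the evidence, and `hostname -I` appears in the Overview; selecting the first address with `awk` and the `SHORT_ID` value are placeholders/assumptions:

```bash
#!/usr/bin/env bash
# Take the first address reported by hostname -I (selection method assumed).
LOCAL_IP=$(hostname -I | awk '{print $1}')
# Placeholder; the real short ID comes from the backend's conversation response.
SHORT_ID="abc123"

echo "View full discussion: http://${LOCAL_IP}:5173/c/${SHORT_ID}"
```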
