# ask-council

Ask LLM Council a question directly from Telegram/chat and get the chairman's synthesized answer without opening the web UI: quick, headless access to multi-model consensus.

## Install

```
openclaw skills install ask-council
```
## Usage

```
/council Should I invest in Tesla right now?
```

Returns the Chairman's synthesized answer after all models have debated. Expect a 30-60 second wait; the models need time to deliberate.
## Requirements

The LLM Council backend must be running:

```
/install-llm-council
```
## Modes

| Mode | Best For | Command |
|---|---|---|
| Quick answer (this skill) | Fast decisions, mobile, casual questions | `/council "question"` |
| Full discussion (web UI) | Deep research, exploring disagreements, seeing all model responses | `/install-llm-council` then open browser |
## Example

Input:

```
/council Is Python or Go better for a new microservice?
```

Output:

```
Council is deliberating... (this may take 30-60s)
................
═══════════════════════════════════════════════════════════════
CHAIRMAN'S ANSWER
═══════════════════════════════════════════════════════════════
Based on the council's deliberation, Python is recommended for rapid
prototyping and team velocity, while Go excels for high-throughput
services where performance is critical...
═══════════════════════════════════════════════════════════════
View full discussion: http://10.0.1.184:5173
```
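The banner framing in the sample output could be produced by a small helper like the following. This is a sketch only: `print_banner` is our own illustrative name, and the real script may format its output differently.

```shell
# Sketch: frame an answer with the banner style shown in the sample output.
# print_banner is an illustrative helper name, not from the real script.
print_banner() {
  local bar='═══════════════════════════════════════════════════════════════'
  printf '%s\nCHAIRMAN'\''S ANSWER\n%s\n%s\n%s\n' "$bar" "$bar" "$1" "$bar"
}

print_banner "Python is recommended for rapid prototyping..."
```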
## How it works

When the user says `/council <question>` or "ask council", the skill runs:

```
bash ~/.openclaw/skills/ask-council/ask-council.sh "<question>"
```
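That dispatch step can be sketched as a small handler. The function names `parse_question` and `handle_message` are hypothetical, chosen for illustration; the real OpenClaw chat integration may do this differently.

```shell
# Illustrative sketch of dispatching a "/council <question>" chat message
# to the skill script. Function names are our own, not OpenClaw's.
parse_question() {
  # Strip the leading "/council " prefix; keep the rest verbatim.
  printf '%s' "${1#/council }"
}

handle_message() {
  case "$1" in
    "/council "*)
      # Quote the question so multi-word input reaches the script as one argument.
      bash ~/.openclaw/skills/ask-council/ask-council.sh "$(parse_question "$1")"
      ;;
  esac
}
```

Quoting the extracted question is the important detail here: without it, a multi-word question would be split into several positional arguments.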
The script queries the backend API and returns the chairman's answer.

## Files

| File | Purpose |
|---|---|
| SKILL.md | Documentation |
| ask-council.sh | Main script; queries the API and returns the answer |
| _meta.json | Skill metadata |