LLM Router

v1.0.0

Use AIsa for model routing, provider setup, and Chinese LLM access. Use when: the user needs model configuration, provider guidance, or routing workflows. Supports setup and model operations.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for baofeng-tech/llm-router.

Prompt Preview: Install & Setup
Install the skill "LLM Router" (baofeng-tech/llm-router) from ClawHub.
Skill page: https://clawhub.ai/baofeng-tech/llm-router
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required env vars: AISA_API_KEY
Required binaries: python3
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install llm-router

ClawHub CLI


npx clawhub@latest install llm-router
Security Scan

Capability signals: requires sensitive credentials. These labels describe what authority the skill may exercise; they are separate from suspicious or malicious moderation verdicts.

  • VirusTotal: Benign
  • OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description describe an AIsa-based LLM router; the code and SKILL.md use only AISA_API_KEY and python3, which is exactly what a gateway client would need.
Instruction Scope
Runtime instructions point to the included CLI script and only require setting AISA_API_KEY; the SKILL.md does not instruct reading unrelated files, scanning host state, or exfiltrating data to unexpected endpoints.
Install Mechanism
No install spec (instruction-only) and a repository-local Python script are provided; there are no downloads, archives, or external installers that would write arbitrary code to disk.
Credentials
Only a single credential (AISA_API_KEY) is required and is consistent with the skill's purpose; no unrelated secrets, config paths, or multiple credential requests appear.
Persistence & Privilege
The "always" flag is false, and the skill does not request or modify agent- or system-wide configs; it does not request persistent elevated privileges beyond normal autonomous invocation.
Assessment
This skill is coherent: it bundles a Python CLI that sends requests to https://api.aisa.one and requires only AISA_API_KEY. Before installing, ensure you trust the AIsa service and are comfortable giving that API key network access; monitor usage and billing on the AIsa account and keep the key scoped and revocable if possible. If you need higher assurance, review the included script (scripts/llm_router_client.py) yourself or run it in an isolated environment; the script makes network calls but does not read unrelated local files or request other credentials.
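One lightweight way to perform that review yourself is to list every host the bundled script references before granting it a real key. The helper below is an illustrative sketch for such a pre-install audit, not part of the skill:

```python
import re
from pathlib import Path

# Matches http(s) URLs; good enough for a quick source-text audit.
URL_RE = re.compile(r"https?://[\w.\-]+")

def outbound_hosts(script_path: str) -> list[str]:
    """Return the unique hosts referenced in a script's source text."""
    text = Path(script_path).read_text(encoding="utf-8")
    return sorted({url.split("//", 1)[1] for url in URL_RE.findall(text)})
```

If the output shows anything beyond api.aisa.one, the bundle you downloaded no longer matches the assessment above.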

Like a lobster shell, security has layers: review code before you run it.

Runtime requirements

🤖 Clawdis

  • Bins: python3
  • Env: AISA_API_KEY
  • Primary env: AISA_API_KEY
  • Latest: vk97fqp1n6vx6y6zx937ecqpyah85bp7k
  • 88 downloads · 0 stars · 1 version
  • Updated 5d ago
  • v1.0.0, MIT-0 license

LLM Router

Use AIsa for model routing, provider setup, and Chinese LLM access. Use when: the user needs model configuration, provider guidance, or routing workflows. Supports setup and model operations.

When to use

  • The user needs model routing, provider setup, or Chinese LLM access.
  • The user wants one place for provider configuration or model selection.
  • The user wants setup guidance for AIsa-hosted model workflows.

High-Intent Workflows

  • Configure an AIsa provider path.
  • Inspect supported models or routing options.
  • Prepare a runtime for Chinese-model access.

Quick Reference

  • python3 scripts/llm_router_client.py --help

Setup

  • AISA_API_KEY is required for AIsa-backed API access.
  • Use repo-relative scripts/ paths from the shipped package.
  • Prefer explicit CLI auth flags when a script exposes them.

Example Requests

  • Help me configure AIsa for Qwen
  • List the supported routed models
  • Choose a model for Chinese long-form analysis

Guardrails

  • Do not ask for extra credentials beyond the shipped flow.
  • Do not advertise setup paths that the public bundle does not ship.
  • Keep setup instructions aligned with the actual runtime.
