AI Model Router v2
Automatically routes requests between local and cloud AI models based on task complexity and privacy, with auto-detection and context tracking.
MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw verdict: Benign (high confidence)
Purpose & Capability
Name/description (route between local/cloud models, privacy, context) match the provided code. Detector reads Ollama config and returns a built-in cloud registry; router implements scoring, privacy checks, and model selection. File reads/writes (e.g., ~/.ollama/models.json and ~/.model-router/models.json) are expected for this purpose.
Instruction Scope
SKILL.md and usage examples instruct running the included Python CLI/library, which aligns with the code. The router does persist conversation contexts (contexts.json) and config (models.json) under ~/.model-router — this is within scope but important: conversation messages (truncated to 200 chars) are written to disk and could contain sensitive excerpts despite privacy-mode routing to local models.
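A minimal sketch of that persistence behavior, assuming a simple JSON store keyed by conversation ID; the function name and store layout are hypothetical, and only the 200-character truncation is taken from the scan:

```python
import json
from pathlib import Path

MAX_MESSAGE_CHARS = 200  # truncation limit reported by the scan

def persist_message(store_path: Path, conversation_id: str, message: str) -> None:
    """Append a truncated message to a contexts.json-style store.

    Hypothetical helper illustrating the reported behavior; the real
    skill's function names and file layout may differ.
    """
    contexts = {}
    if store_path.exists():
        contexts = json.loads(store_path.read_text())
    # Even with truncation, the first 200 characters may contain secrets.
    contexts.setdefault(conversation_id, []).append(message[:MAX_MESSAGE_CHARS])
    store_path.write_text(json.dumps(contexts, indent=2))
```

The point of the sketch is that truncation limits, but does not remove, the sensitive excerpts that end up on disk.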
Install Mechanism
There is no packaged install spec; this is effectively instruction + included source. install.sh is a harmless echo script. No network downloads, no package manager installs, and code is present in the skill bundle (read-only files included).
Credentials
The skill does not require environment variables to run. Some model entries include requires_api_key and api_key_env fields (e.g., ANTHROPIC_API_KEY) which are logical if you choose cloud models, but the skill does not demand them upfront. Detector reads only local files and does not perform network calls.
Persistence & Privilege
The skill creates and writes to ~/.model-router/{models.json,contexts.json} for config and conversation history. It does not request elevated privileges or system-wide changes and is not always: true. Persisting contexts/config in user home is expected but worth noting for privacy.
Assessment
This skill appears to do what it says: route tasks between local and cloud models and keep simple conversation context. Before installing, be aware it will:
- Read ~/.ollama/models.json (if present) to detect local models.
- Create and write ~/.model-router/models.json and ~/.model-router/contexts.json. Conversation messages are truncated to 200 characters but may still include sensitive fragments.
- Offer cloud models that may require you to set API key environment variables (it does not automatically exfiltrate them).
Recommendations:
- If you handle secrets, consider disabling context tracking (don't pass conversation_id or set enable_context=False) or inspect/secure ~/.model-router/contexts.json.
- Review the bundled Python files locally if you want to confirm behavior; they perform only file reads/writes and no network calls or subprocess execution.
- If you plan to use cloud models, only set API keys as environment variables you trust and prefer provider-specific config rather than leaving keys in conversation text.
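To act on the first recommendation, a hypothetical inspection helper might look like the following; the skill ships no such tool, and only the store path comes from the scan above:

```python
import json
from pathlib import Path

DEFAULT_STORE = Path.home() / ".model-router" / "contexts.json"

def audit_contexts(path: Path = DEFAULT_STORE) -> dict:
    """Report how many persisted message fragments exist per conversation."""
    if not path.exists():
        print("no persisted contexts")
        return {}
    contexts = json.loads(path.read_text())
    for conv_id, messages in contexts.items():
        print(f"{conv_id}: {len(messages)} fragment(s)")
    return contexts

def clear_contexts(path: Path = DEFAULT_STORE) -> None:
    """Delete the persisted conversation store entirely."""
    path.unlink(missing_ok=True)
```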
Overall, the package is coherent and not suspicious, but exercise normal caution with persisted conversation data.
Current version: v1.1.0
SKILL.md
AI Model Router
Compact, intelligent model routing that just works.
Quick Start
# Install
npx clawhub@latest install ai-model-router
# First run - auto-detects your models
python3 skill/core/router.py "What is Python?"
# List available models
python3 skill/core/router.py --list
How It Works
Your Request → Analyze → Select Model
↓
Simple? → Primary (fast/cheap)
Complex? → Secondary (capable)
Private? → Primary (forced)
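The flow above can be sketched as follows; the at-or-above threshold comparison and the "primary"/"secondary" labels are assumptions based on this README:

```python
def select_model(score: int, is_private: bool, threshold: int = 5) -> str:
    """Mirror the routing diagram: privacy wins outright, then the score threshold."""
    if is_private:
        return "primary"    # Private? -> Primary (forced, stays local)
    if score >= threshold:
        return "secondary"  # Complex? -> Secondary (capable)
    return "primary"        # Simple? -> Primary (fast/cheap)
```

Note that the privacy check runs first, so sensitive tasks never reach a cloud model regardless of complexity.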
Scoring (from model-router-premium)
| Pattern | Points |
|---|---|
| Microservices, architecture | +10 |
| Design, implement, optimize | +5 |
| Explain, analyze, compare | +3 |
| Syntax, example, "what is" | -3 |
Threshold: 5 (simple vs complex)
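A minimal scoring function consistent with the table; the substring matching here is illustrative, and the real router's patterns may differ:

```python
# Keyword groups and points from the scoring table above.
SCORE_PATTERNS = [
    (("microservices", "architecture"), 10),
    (("design", "implement", "optimize"), 5),
    (("explain", "analyze", "compare"), 3),
    (("syntax", "example", "what is"), -3),
]

def score_task(task: str) -> int:
    """Sum points for every keyword that appears in the task text."""
    text = task.lower()
    return sum(points for keywords, points in SCORE_PATTERNS
               for kw in keywords if kw in text)
```

For example, "Design microservices" scores +5 (design) and +10 (microservices) for a total of 15, matching the `complex_task(score=15)` reason shown in the Python API section.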
Features
| Feature | Status |
|---|---|
| Auto-detect local models | ✓ (Ollama, LM Studio) |
| Cloud model registry | ✓ (7 built-in) |
| Privacy detection | ✓ (API keys, passwords) |
| Context tracking | ✓ (conversations) |
| JSON config | ✓ (optional) |
| CLI interface | ✓ |
| Core code size | ~200 lines |
CLI
# Route a task
python3 skill/core/router.py "Design a system"
python3 skill/core/router.py "What is a for loop?"
# Options
--json # JSON output
--force primary # Force primary model
--list # List all models
--status # Show status
Python API
from skill.core.router import RouterCore
router = RouterCore()
result = router.route("Design microservices")
print(result.model_name) # "Claude Opus 4"
print(result.reason) # "complex_task(score=15)"
print(result.confidence) # 0.75
Configuration (Optional)
Create ~/.model-router/models.json:
{
"primary_model": {"id": "ollama:llama3:8b"},
"secondary_model": {"id": "anthropic:claude-opus-4"},
"models": [...]
}
Without config: Auto-detects local + uses cloud registry.
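That fallback might be sketched as follows; the function name is hypothetical, and the hardcoded defaults (mirroring the example config above) stand in for the skill's real auto-detection:

```python
import json
from pathlib import Path

# Stand-in defaults mirroring the example config; the real skill
# auto-detects local models instead of hardcoding these IDs.
DEFAULT_CONFIG = {
    "primary_model": {"id": "ollama:llama3:8b"},
    "secondary_model": {"id": "anthropic:claude-opus-4"},
}

def load_config(path: Path = Path.home() / ".model-router" / "models.json") -> dict:
    """Use the JSON config if present, otherwise fall back to defaults."""
    if path.exists():
        return json.loads(path.read_text())
    return dict(DEFAULT_CONFIG)
```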
Privacy Protection
Automatically forces primary (local) when sensitive data detected:
- API keys (`sk-...`, `api_key`)
- Passwords (`password`, `passwd`)
- Tokens (`bearer`, `secret`)
- Emails, SSNs, credit card numbers
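A regex-based check consistent with that list; the regexes here are illustrative, not the skill's actual patterns:

```python
import re

# Illustrative patterns for the sensitive-data categories listed above.
SENSITIVE_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{8,}"),          # API keys like sk-...
    re.compile(r"api[_-]?key", re.IGNORECASE),
    re.compile(r"passw(or)?d", re.IGNORECASE),  # password, passwd
    re.compile(r"\b(bearer|secret|token)\b", re.IGNORECASE),
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),     # email addresses
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),       # SSN-shaped numbers
]

def is_sensitive(text: str) -> bool:
    """True if any sensitive-looking pattern appears in the task text."""
    return any(p.search(text) for p in SENSITIVE_PATTERNS)
```

A match forces the task to the primary (local) model, as described above.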
Files
- `core/router.py` - Core routing engine (~200 lines)
- `modules/detector.py` - Auto-detection (optional)
- `modules/context.py` - Context tracking (optional)
Inspired By
- model-router-premium: Simple scoring logic, cost-aware routing
- Model Router v1: Full feature set, documentation
This version combines:
- The simplicity of model-router-premium (~200 lines)
- The features of ai-model-router (privacy, auto-detect, context)
Files: 7 total