Relay To Agent
Relay messages to AI agents on any OpenAI-compatible API. Supports multi-turn conversations with session management. List agents, send messages, reset sessions.
MIT-0 · Free to use, modify, and redistribute. No attribution required.
⭐ 6 · 2.5k · 13 current installs · 13 all-time installs
by Eric Santos (@ericsantos)
Security Scan
OpenClaw
Benign (high confidence)
Purpose & Capability
Name/description, required binary (node), primary credential (RELAY_API_KEY), and the included script all align: the tool reads agents.json, calls an OpenAI-compatible base URL, and returns replies. No unrelated credentials or binaries are requested.
Instruction Scope
Runtime instructions and the script perform expected actions: they read agents.json (or RELAY_CONFIG), use RELAY_API_KEY and optional RELAY_BASE_URL, send chat completions to the configured endpoint, and persist up to 50 messages per agent+session under ~/.cache/relay-to-agent/sessions. Note: user content is written to disk (session files) and transmitted to the configured remote endpoint — verify that endpoint is trusted for sensitive content.
Install Mechanism
This is instruction-only (no install spec). A node script and package.json/package-lock are included; there is no automated download-from-URL or other high-risk installer. Running it may require doing an npm install locally; the declared dependency (openai-fetch) is reasonable for the stated purpose.
Credentials
Only RELAY_API_KEY is required as the primary secret; optional RELAY_BASE_URL and RELAY_CONFIG are documented. The requested env vars are proportional and justified by the skill's function.
Persistence & Privilege
The skill is not always-enabled and does not modify other skills or system-wide settings. It stores session data under the user's home directory (~/.cache/relay-to-agent/sessions), which is expected for multi-turn state; this is a moderate local persistence but scoped to the user's home.
Assessment
This skill appears to do what it says: it sends messages to an OpenAI-compatible API and stores local session history. Before installing or running it:
1. Verify agents.json (or the RELAY_CONFIG path) to ensure baseUrl points to a trusted provider — your RELAY_API_KEY will be sent to that base URL.
2. Be aware that message contents are written to ~/.cache/relay-to-agent/sessions (up to 50 messages) — remove these files if you need to protect sensitive data.
3. If you run npm install to use the supplied package.json, review dependencies (package-lock.json shows openai-fetch and ky) and ensure node >= 18.
4. Avoid using high-privilege or production API keys with untrusted or unknown endpoints; prefer a scoped/test key.
5. For stronger guarantees, inspect agents.json for any unexpected agent entries and set RELAY_BASE_URL explicitly rather than relying on a bundled config.
Overall there are no incoherent or suspicious requirements, but treat the configured endpoint and stored session data as the primary privacy risk.
Like a lobster shell, security has layers — review code before you run it.
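The endpoint check above can be scripted before you export any key. This is a hedged sketch — `configuredEndpoint` is an illustrative helper name, not part of the skill — that reports which base URL a given config file would send your RELAY_API_KEY to:

```javascript
import fs from 'node:fs';

// Hypothetical helper: returns the endpoint a config file points at,
// so you can confirm it is a trusted provider before exporting RELAY_API_KEY.
function configuredEndpoint(configPath) {
  const { baseUrl } = JSON.parse(fs.readFileSync(configPath, 'utf8'));
  return baseUrl;
}

// Example: console.log(configuredEndpoint('agents.json'));
```

To clear stored transcripts (point 2 above), removing `~/.cache/relay-to-agent/sessions/` is sufficient, since all session state documented here lives under that directory.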
Current version: v0.0.1
License
MIT-0
Free to use, modify, and redistribute. No attribution required.
Runtime requirements
🤖 Clawdis
Bins: node
Primary env: RELAY_API_KEY
SKILL.md
Relay To Agent
Send messages to AI agents on any OpenAI-compatible endpoint. Works with Connect Chat, OpenRouter, LiteLLM, vLLM, Ollama, and any service implementing the Chat Completions API.
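Conceptually, each relay call replays the stored session history and appends the new message into a standard Chat Completions request. The sketch below is illustrative — `buildRequest` is a hypothetical name, not a function in the actual script (which uses openai-fetch):

```javascript
// Illustrative sketch of the Chat Completions request shape;
// not the actual relay.mjs implementation.
function buildRequest(baseUrl, model, history, userMessage) {
  return {
    url: `${baseUrl}/chat/completions`,
    body: {
      model,
      // Prior session messages are replayed so the agent keeps context.
      messages: [...history, { role: 'user', content: userMessage }],
    },
  };
}
```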
List available agents
```bash
node {baseDir}/scripts/relay.mjs --list
```
Send a message to an agent
```bash
node {baseDir}/scripts/relay.mjs --agent linkedin-alchemist "Transform this article into a LinkedIn post"
```
Multi-turn conversation
```bash
# First message
node {baseDir}/scripts/relay.mjs --agent connect-flow-ai "Analyze our latest campaign"

# Follow-up (same session, agent remembers context)
node {baseDir}/scripts/relay.mjs --agent connect-flow-ai "Compare with last month"
```
Reset session
```bash
node {baseDir}/scripts/relay.mjs --agent linkedin-alchemist --reset "Start fresh with this article..."
```
Options
| Flag | Description | Default |
|---|---|---|
| `--agent ID` | Target agent identifier | (required) |
| `--reset` | Reset conversation before sending | off |
| `--list` | List available agents | — |
| `--session ID` | Custom session identifier | `default` |
| `--json` | Raw JSON output | off |
Configuration
agents.json
Configure agents and endpoint in {baseDir}/agents.json:
```json
{
  "baseUrl": "https://api.example.com/v1",
  "agents": [
    {
      "id": "my-agent",
      "name": "My Agent",
      "description": "What this agent does",
      "model": "model-id-on-the-api"
    }
  ]
}
```
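Putting the config file and environment variables together, resolution likely works along these lines. `resolveConfig` is a hypothetical name, but the precedence shown (RELAY_CONFIG picks the file, RELAY_BASE_URL overrides its baseUrl) matches the variables documented below:

```javascript
import fs from 'node:fs';
import path from 'node:path';

// Hypothetical sketch of config loading; not the actual script's code.
function resolveConfig(baseDir, env = process.env) {
  // RELAY_CONFIG, if set, replaces the bundled agents.json path.
  const configPath = env.RELAY_CONFIG || path.join(baseDir, 'agents.json');
  const config = JSON.parse(fs.readFileSync(configPath, 'utf8'));
  // RELAY_BASE_URL, if set, overrides the baseUrl from the file.
  if (env.RELAY_BASE_URL) config.baseUrl = env.RELAY_BASE_URL;
  return config;
}
```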
Environment variables
```bash
export RELAY_API_KEY="sk-..."               # API key (required)
export RELAY_BASE_URL="https://..."         # Override base URL from config
export RELAY_CONFIG="/path/to/agents.json"  # Custom config path
```
Compatible Services
- Connect Chat — api.connectchat.ai/api
- OpenRouter — openrouter.ai/api/v1
- LiteLLM — localhost:4000/v1
- vLLM — localhost:8000/v1
- Ollama — localhost:11434/v1
- Any OpenAI-compatible API
Session Management
Sessions are stored locally at ~/.cache/relay-to-agent/sessions/. Each agent+session combination keeps up to 50 messages. Use --session for parallel conversations with the same agent.
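The 50-message cap suggests logic along these lines before each session file is saved (`trimHistory` and `MAX_MESSAGES` are hypothetical names, not taken from the script):

```javascript
// Hypothetical sketch of the per-session history cap; illustrative only.
const MAX_MESSAGES = 50;

function trimHistory(messages) {
  // Keep only the most recent MAX_MESSAGES entries; shorter
  // histories pass through unchanged.
  return messages.slice(-MAX_MESSAGES);
}
```

One consequence of a cap like this: in very long conversations the agent loses the earliest context, so `--reset` is the cleaner way to start a genuinely new topic.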
Files: 6 total