Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

One API Calling GenAI

Unified interface for all providers and all modalities: use the `genai-calling` skill to operate the published `genai-calling` CLI/SDK across text/image/audi...

MIT-0 · Free to use, modify, and redistribute. No attribution required.
0 · 33 · 0 current installs · 0 all-time installs
Security Scan
VirusTotal
Benign
View report →
OpenClaw
Suspicious
medium confidence
Purpose & Capability
The name/description (a unified multi-provider GenAI CLI/SDK) aligns with the instructions, which expect provider API keys and show example commands. However, the registry metadata declares no required environment variables or credentials even though SKILL.md clearly documents many sensitive provider variables and config files; this metadata mismatch is noteworthy.
Instruction Scope
SKILL.md instructs the agent/user to create and use project-local .env.* files and a user-wide ~/.genai-calling/.env (reads and writes in the user's home), to pass provider API keys (OPENAI_API_KEY, GOOGLE_API_KEY, etc.), to allow downloading arbitrary URLs (configurable, including an option to permit private/loopback URLs), and to optionally run an MCP server with bearer tokens and a public base URL. If misconfigured, these operations can expose sensitive credentials and permit internal-network access (an SSRF-like risk). The instructions also suggest running `pip install genai-calling`, which pulls code from PyPI (or another source) for execution.
Install Mechanism
The skill bundle itself has no install spec (it is instruction-only), which lowers the attack surface. However, SKILL.md recommends installing an external Python package (`genai-calling`) via pip when `uvx` is unavailable. Installing that package executes third-party code, and the registry provides no homepage or source repo to verify the package before installation, which increases the risk.
Credentials
The documented runtime clearly requires multiple provider API keys and optional MCP bearer tokens; that is proportionate to a multi-provider CLI. But the registry declares no required environment variables, which is inconsistent and may mislead users. The skill also reads user-wide config (~/.genai-calling/.env), which could aggregate credentials from other projects, a privacy and credential-exposure concern.
Persistence & Privilege
always:false (good). The instructions recommend creating/writing files under the skill directory and ~/.genai-calling/.env (persistent data on the user's machine). The MCP server feature could open a local listening port and expose a public base URL if configured — useful for legitimate use but potentially risky if misconfigured. The skill does not request system-wide skill-modifying privileges in the metadata.
What to consider before installing
  • Metadata mismatch: the registry lists no required env vars, but SKILL.md expects many sensitive provider keys and config files. Ask for the source repo or homepage to verify origin.
  • Inspect the package before pip installing: the docs recommend `pip install genai-calling`; verify the PyPI project, source repository, and readable code before executing third-party code.
  • Protect credentials: store only the minimal provider keys you need. Prefer project-scoped keys and avoid placing long-lived/high-privilege keys in a shared ~/.genai-calling/.env.
  • Beware of private-URL downloads: enabling GENAI_CALLING_ALLOW_PRIVATE_URLS or allowing arbitrary URL downloads can let the tool reach internal network addresses (SSRF-like risk). Disable unless you understand the consequences.
  • MCP server caution: starting the MCP server can expose local services or accept remote requests if misconfigured (GENAI_CALLING_MCP_PUBLIC_BASE_URL, bearer tokens). Do not enable public exposure without reviewing the config and firewall rules.
  • Use isolation: test the tool in an isolated environment (VM/container) with low-privilege credentials before enabling it in production.
  • Source provenance: ask the publisher for a homepage/source repo, the PyPI package name, and a checksum. If they cannot provide verifiable source, treat the package as higher risk.

If you can provide the package name on PyPI or a source repository link, I can re-evaluate with higher confidence. If you need a checklist for safe inspection steps (pip metadata, file listing, quick code scan), tell me which environment you'll run in and I can provide one.

Like a lobster shell, security has layers — review code before you run it.

Current version: v1.0.2
Download zip
latest: vk978n1httnknvt1kp5pcg6x0sd835mwd


SKILL.md

genai-calling

This skill is named genai-calling in this repository.

The runtime package is genai-calling, the Python import path is gravtice.genai, and the environment variable prefix is GENAI_CALLING_*.

Quick Start

IMPORTANT: If you rely on project-local .env.* files, run commands in the directory that contains those files (typically this skill base directory). If no project-local env file is present, the runtime also falls back to ~/.genai-calling/.env. If you pass runtime env vars (inline/export), the working directory is not restricted.

# 1) Create `.env.local` in this skill directory
(cd "<SKILL_BASE_DIR>" && { test -f .env.local || touch .env.local; })

# 2) Edit `<SKILL_BASE_DIR>/.env.local` and set at least one provider key (see "Configuration Templates" and "Supported Environment Variables").
# Example (OpenAI):
#   OPENAI_API_KEY=...

# 3) Text
(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai --model openai:gpt-4o-mini --prompt "Hello")

# 4) See what you can use (requires at least one provider key configured)
(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai model available --all)

For user-wide defaults shared across projects, create ~/.genai-calling/.env and put provider credentials there. Project-local .env.* files still win.

If uvx is unavailable, install once and use genai directly:

python -m pip install --upgrade genai-calling
(cd "<SKILL_BASE_DIR>" && genai --model openai:gpt-4o-mini --prompt "Hello")

Configuration (Env Vars, Zero-parameter)

Configuration is managed via environment variables.

You can set env vars in two ways:

  1. Runtime env vars (inline before command, or export in shell)
  2. Env files (.env.local, .env.production, .env.development, .env.test, ~/.genai-calling/.env)

Recommended for this skill:

  • Put reusable provider credentials in ~/.genai-calling/.env
  • Put project-level overrides in <SKILL_BASE_DIR>/.env.local
  • Use runtime env vars for one-off overrides

Runtime example (inline):

(cd "<SKILL_BASE_DIR>" && OPENAI_API_KEY=... uvx --from genai-calling genai --model openai:gpt-4o-mini --prompt "Hello")

When env files are used, SDK/CLI/MCP loads them automatically with priority (high -> low):

  • .env.local > .env.production > .env.development > .env.test > ~/.genai-calling/.env

Process env vars override file-based config.
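As a rough sketch only (not the package's actual loader), the precedence above amounts to merging the env files from lowest to highest priority and then letting process env vars win; `parse_env_file` and `resolve_config` are illustrative helpers, not part of genai-calling:

```python
import os
from pathlib import Path

# Load order, highest priority first (matching the precedence list above).
ENV_FILES = [
    Path(".env.local"),
    Path(".env.production"),
    Path(".env.development"),
    Path(".env.test"),
    Path.home() / ".genai-calling" / ".env",
]

def parse_env_file(path: Path) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks, comments, and malformed lines."""
    values = {}
    if not path.is_file():
        return values
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values

def resolve_config() -> dict:
    """Merge env files from lowest to highest priority, then apply process env vars on top."""
    merged = {}
    for path in reversed(ENV_FILES):  # lowest priority first
        merged.update(parse_env_file(path))
    merged.update(os.environ)         # process env overrides file-based config
    return merged
```

The real loader may handle quoting, export prefixes, and interpolation; this sketch only captures the layering order.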

Minimal .env.local (OpenAI text only):

OPENAI_API_KEY=...
GENAI_CALLING_TIMEOUT_MS=120000

Minimal ~/.genai-calling/.env:

OPENAI_API_KEY=...
GOOGLE_API_KEY=
ANTHROPIC_API_KEY=

Notes:

  • Do not commit .env.local (add it to .gitignore if needed).
  • ~/.genai-calling/.env is user-level config and should hold only values you want shared across projects.
  • Provider credentials should use the shorter provider-specific variable names such as OPENAI_API_KEY.

Configuration Templates

Project-local .env.local example:

# Copy only what you need. Do not commit `.env.local`.

# --------------------
# Common
# --------------------
GENAI_CALLING_TIMEOUT_MS=120000
GENAI_CALLING_URL_DOWNLOAD_MAX_BYTES=134217728
# GENAI_CALLING_ALLOW_PRIVATE_URLS=1
# GENAI_CALLING_TRANSPORT=

# --------------------
# Providers
# --------------------
OPENAI_API_KEY=
GOOGLE_API_KEY=
ANTHROPIC_API_KEY=

ALIYUN_API_KEY=
ALIYUN_OAI_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1

VOLCENGINE_API_KEY=
VOLCENGINE_OAI_BASE_URL=https://ark.cn-beijing.volces.com/api/v3

TUZI_BASE_URL=https://api.tu-zi.com
# TUZI_OAI_BASE_URL=https://api.tu-zi.com/v1
# TUZI_GOOGLE_BASE_URL=https://api.tu-zi.com
# TUZI_ANTHROPIC_BASE_URL=https://api.tu-zi.com
TUZI_WEB_API_KEY=
TUZI_OPENAI_API_KEY=
TUZI_GOOGLE_API_KEY=
TUZI_ANTHROPIC_API_KEY=

# --------------------
# MCP Server
# --------------------
GENAI_CALLING_MCP_HOST=127.0.0.1
GENAI_CALLING_MCP_PORT=6001
GENAI_CALLING_MCP_PUBLIC_BASE_URL=
# GENAI_CALLING_MCP_BEARER_TOKEN=
# GENAI_CALLING_MCP_TOKEN_RULES=token1: [openai google]; token2: [openai:gpt-4o-mini]

User-wide ~/.genai-calling/.env example:

OPENAI_API_KEY=
GOOGLE_API_KEY=
ANTHROPIC_API_KEY=

Recommended usage:

  • Put shared provider credentials in ~/.genai-calling/.env
  • Put project-specific overrides and MCP settings in <SKILL_BASE_DIR>/.env.local
  • Keep only the providers and options you actually use

Supported Environment Variables

Common runtime

  • GENAI_CALLING_TIMEOUT_MS (default: 120000)
  • GENAI_CALLING_URL_DOWNLOAD_MAX_BYTES (default: 134217728)
  • GENAI_CALLING_ALLOW_PRIVATE_URLS (1/true/yes to allow private/loopback URL download)
  • GENAI_CALLING_TRANSPORT (internal transport marker; MCP server uses mcp, legacy sse is accepted)

Provider credentials

  • OpenAI: OPENAI_API_KEY
  • Google (Gemini): GOOGLE_API_KEY
  • Anthropic (Claude): ANTHROPIC_API_KEY
  • Aliyun (DashScope/Bailian): ALIYUN_API_KEY
  • Volcengine (Ark/Doubao): VOLCENGINE_API_KEY

Provider base URL overrides

  • ALIYUN_OAI_BASE_URL (default: https://dashscope.aliyuncs.com/compatible-mode/v1)
  • VOLCENGINE_OAI_BASE_URL (default: https://ark.cn-beijing.volces.com/api/v3)
  • TUZI_BASE_URL (default: https://api.tu-zi.com)
  • TUZI_OAI_BASE_URL (optional override)
  • TUZI_GOOGLE_BASE_URL (optional override)
  • TUZI_ANTHROPIC_BASE_URL (optional override)

Tuzi credentials

  • TUZI_WEB_API_KEY
  • TUZI_OPENAI_API_KEY
  • TUZI_GOOGLE_API_KEY
  • TUZI_ANTHROPIC_API_KEY

MCP server

  • GENAI_CALLING_MCP_HOST (default: 127.0.0.1)
  • GENAI_CALLING_MCP_PORT (default: 6001)
  • GENAI_CALLING_MCP_PUBLIC_BASE_URL
  • GENAI_CALLING_MCP_BEARER_TOKEN
  • GENAI_CALLING_MCP_TOKEN_RULES
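The GENAI_CALLING_MCP_TOKEN_RULES example shown earlier (`token1: [openai google]; token2: [openai:gpt-4o-mini]`) suggests a token-to-scopes mapping. A sketch of parsing such a string, assuming that inferred format (the real parser in genai-calling may differ):

```python
def parse_token_rules(rules: str) -> dict:
    """Parse 'token: [scope scope]; token2: [scope]' into {token: [scopes]}.

    The format is inferred from the documented example; scopes may be a
    provider name ('openai') or a provider:model pair ('openai:gpt-4o-mini').
    """
    mapping = {}
    for entry in rules.split(";"):
        entry = entry.strip()
        if not entry:
            continue
        # Split on the first colon only, so provider:model scopes survive intact.
        token, _, scopes = entry.partition(":")
        scopes = scopes.strip()
        if scopes.startswith("[") and scopes.endswith("]"):
            scopes = scopes[1:-1]
        mapping[token.strip()] = scopes.split()
    return mapping
```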

Quick guidance:

  • Most users only need one provider key plus GENAI_CALLING_TIMEOUT_MS
  • Only set GENAI_CALLING_ALLOW_PRIVATE_URLS if you explicitly want to bypass private URL protection
  • Only set MCP variables when you run genai-mcp-server

Model Format

Model string is {provider}:{model_id} (example: openai:gpt-4o-mini).
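For illustration, splitting such a string takes one partition on the first colon; `parse_model` is a hypothetical helper, not part of the published SDK:

```python
def parse_model(model: str) -> tuple:
    """Split a '{provider}:{model_id}' string into (provider, model_id)."""
    provider, sep, model_id = model.partition(":")
    if not sep or not provider or not model_id:
        raise ValueError(f"expected '{{provider}}:{{model_id}}', got {model!r}")
    return provider, model_id
```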

Use this to pick a model by output modality:

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai model available --all)
# Look for: out=text / out=image / out=audio / out=video / out=embedding

If you have not configured any keys yet, you can still view the SDK curated list:

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai model sdk)

Common Scenarios

Image understanding

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai --model openai:gpt-4o-mini --prompt "Describe this image" --image-path "/path/to/image.png")

Image generation (save to file)

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai --model openai:gpt-image-1 --prompt "A red square, minimal" --output-path "/tmp/out.png")

Speech-to-text (transcription)

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai --model openai:whisper-1 --audio-path "/path/to/audio.wav")

Text-to-speech (save to file)

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai --model openai:tts-1 --prompt "你好" --output-path "/tmp/tts.mp3")

Python SDK

Install:

python -m pip install --upgrade genai-calling

Minimal example:

from gravtice.genai import Client, GenerateRequest, Message, OutputSpec, Part

client = Client()
resp = client.generate(
    GenerateRequest(
        model="openai:gpt-4o-mini",
        input=[Message(role="user", content=[Part.from_text("Hello")])],
        output=OutputSpec(modalities=["text"]),
    )
)
print(resp.output[0].content[0].text)

Note: Client() loads project-local .env.* from the current working directory and then falls back to ~/.genai-calling/.env; run your script in the directory that contains your project env files, or export env vars in the process environment.

MCP Server

Start server (Streamable HTTP: /mcp, SSE: /sse):

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai-mcp-server)

Recommended: set auth via runtime env vars, .env.local, or ~/.genai-calling/.env before exposing the server:

# GENAI_CALLING_MCP_BEARER_TOKEN=sk-...

Debug with MCP CLI:

(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai-mcp-cli env)
(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai-mcp-cli tools)
(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai-mcp-cli call list_providers)
(cd "<SKILL_BASE_DIR>" && uvx --from genai-calling genai-mcp-cli call generate --args '{"request":{"model":"openai:gpt-4o-mini","input":"Hello","output":{"modalities":["text"]}}}')

Troubleshooting

Missing/invalid API key (401/403)

Set provider credentials via runtime env vars, <SKILL_BASE_DIR>/.env.local, or ~/.genai-calling/.env (see "Supported Environment Variables"), then retry.

File input errors (mime type)

If you see cannot detect ... mime type, verify the path exists and is a valid image/audio/video file.

Timeout / long-running jobs

Increase GENAI_CALLING_TIMEOUT_MS (runtime env var, .env.local, or ~/.genai-calling/.env) and retry.

URL download blocked / SSRF protection

Binary outputs may be returned as URLs. Private/loopback URLs are rejected by default. Set GENAI_CALLING_ALLOW_PRIVATE_URLS=1 only if you understand the risk.
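The default rejection of private/loopback URLs can be sketched like this. This is an illustrative check, not the tool's actual implementation; a real guard must also resolve DNS names and re-check every redirect hop:

```python
import ipaddress
from urllib.parse import urlparse

def is_private_url(url: str) -> bool:
    """Return True for loopback/private/link-local hosts.

    Only literal IPs and 'localhost' are handled here; DNS names would
    need to be resolved (and each redirect re-checked) before deciding.
    """
    host = urlparse(url).hostname or ""
    if host == "localhost":
        return True
    try:
        ip = ipaddress.ip_address(host)
    except ValueError:
        return False  # DNS name: resolution step omitted in this sketch
    return ip.is_loopback or ip.is_private or ip.is_link_local
```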

MCP auth (401 Unauthorized)

Set GENAI_CALLING_MCP_BEARER_TOKEN (or GENAI_CALLING_MCP_TOKEN_RULES) via runtime env var, .env.local, or ~/.genai-calling/.env, and ensure genai-mcp-cli uses the same token.
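A minimal sketch of a server-side bearer check, assuming a standard `Authorization: Bearer <token>` header; the actual genai-calling implementation may differ:

```python
import hmac

def authorize(header_value: str, expected_token: str) -> bool:
    """Validate an 'Authorization: Bearer <token>' header value.

    Uses a constant-time comparison to avoid leaking the token via timing.
    """
    scheme, _, token = header_value.partition(" ")
    if scheme.lower() != "bearer" or not token:
        return False
    return hmac.compare_digest(token.strip(), expected_token)
```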

Files

1 total
