Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

LLM Models

v0.1.5

Access Claude, Gemini, Kimi, GLM and 100+ LLMs via inference.sh CLI using OpenRouter. Models: Claude Opus 4.5, Claude Sonnet 4.5, Claude Haiku 4.5, Gemini 3...

0 · 1.3k · 5 current · 5 all-time
by Ömer Karışman @okaris
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal
Suspicious
OpenClaw
Benign
medium confidence
Purpose & Capability
The name/description (access many LLMs via inference.sh/OpenRouter) matches the SKILL.md examples and commands. Required binaries/env vars are not declared because the skill is instruction-only and expects the inference.sh CLI and an account login, which is proportionate to the stated purpose.
Instruction Scope
Instructions are narrowly scoped to installing the inference.sh CLI, running login, and calling infsh app run/sample commands. They do not ask the agent to read arbitrary system files or unrelated credentials, but they do direct the user to run a remote installer (curl | sh) and to perform an interactive login (so you will supply credentials).
Install Mechanism
The SKILL.md tells users to run a remote installer via curl -fsSL https://cli.inference.sh | sh, which downloads and installs a binary from dist.inference.sh. The doc mentions SHA-256 checksums, but a piped install is a moderate risk — the installer writes executables to disk, and you should verify the script contents and the published checksums before executing anything.
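The safer pattern described above — download first, publish-and-check a checksum, run only on a match — can be sketched with a local stand-in file. The filenames here are illustrative; in practice the script would come from https://cli.inference.sh and the expected hash from the checksums published on dist.inference.sh.

```shell
# Stand-in for the downloaded installer; in practice:
#   curl -fsSL https://cli.inference.sh -o install.sh
printf 'echo hello from installer\n' > install.sh

# "Publisher" side: the checksum file that would ship alongside the script.
sha256sum install.sh > install.sh.sha256

# "Client" side: sha256sum -c exits non-zero on any mismatch,
# so the script only runs after verification succeeds.
if sha256sum -c install.sh.sha256; then   # prints "install.sh: OK"
  sh install.sh                           # prints "hello from installer"
else
  echo "checksum mismatch - refusing to run" >&2
fi
```

The same two-step shape (fetch, verify, then execute) removes the main hazard of curl | sh: executing content you never had a chance to inspect.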
Credentials
No environment variables are declared in the registry metadata, yet the instructions require an interactive 'infsh login' (so you'll supply an API key or account credentials). This is expected for a CLI that talks to OpenRouter/inference.sh and is not disproportionate, but the skill does not declare the auth variables or storage location explicitly.
Persistence & Privilege
The 'always' flag is false and there are no OS restrictions. The installer will create a local CLI binary (persistent on disk), which is normal for a CLI-based integration. The skill does not request elevated platform privileges or modify other skills.
Assessment
This skill appears to be what it claims: a guide to using the inference.sh CLI to access many models. Before installing:
1) Inspect the installer script at https://cli.inference.sh (do not blindly run curl | sh).
2) Verify the downloaded binary's SHA-256 against the published checksums on dist.inference.sh.
3) Use API credentials only for the intended provider (inference.sh/OpenRouter) and avoid reusing high-privilege keys.
4) Consider installing in a sandbox or container if you want to limit risk.
If you need the skill to run non-interactively, ask the author how credentials are expected to be provided and stored.
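The sandboxing suggestion in the assessment can be approximated cheaply even without a container: run the installer with a throwaway HOME so anything it writes lands in one inspectable, disposable directory. The installer below is a local stand-in (the real one comes from https://cli.inference.sh); a container such as `docker run --rm -it ubuntu:24.04 sh` gives stronger isolation.

```shell
# Throwaway directory that will act as the installer's HOME.
sandbox=$(mktemp -d)

# Stand-in installer: like many CLI installers, it writes a binary
# under $HOME/.local/bin. The real script is fetched separately.
printf 'mkdir -p "$HOME/.local/bin" && echo fake-cli > "$HOME/.local/bin/infsh"\n' > installer.sh

# Run it with HOME pointed at the sandbox; writes are confined there.
HOME="$sandbox" sh installer.sh

# Inspect exactly what was installed before trusting it.
find "$sandbox" -type f
```

After review you can either promote the binary into your real PATH or delete the sandbox directory wholesale, discarding everything the installer did.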

Like a lobster shell, security has layers — review code before you run it.

latest: vk972x2phc41rr554ry553d27f981d5nf

