Skill v2.2.1
ClawScan security
mlx-local-inference · ClawHub's context-aware review of the artifact, metadata, and declared behavior.
Scanner verdict
Benign · Mar 17, 2026, 9:17 AM
- Verdict
- benign
- Confidence
- medium
- Model
- gpt-5-mini
- Summary
- The skill's instructions, required tools, and runtime behavior match its stated purpose (running local models via an oMLX gateway and uv); nothing requested is disproportionate, though the README-style installer command (curl | sh) and external installer origin deserve user verification.
- Guidance
- This skill appears to do what it says: operate local ML models via a local oMLX gateway and the 'uv' runner. Before installing or following the SKILL.md:
- 1) Verify that the 'uv' installer URL (https://astral.sh) is one you trust; avoid running arbitrary curl | sh commands, and prefer a package manager where available.
- 2) Confirm you have enough disk space and that large model files are placed under ~/models as described.
- 3) The skill talks to localhost:8000 only; make sure oMLX is intentionally running and not exposed to untrusted networks.
- 4) Restarting via launchctl affects only your user service; it requires appropriate user permissions but is not a system-wide change.
- If you need higher assurance, ask the skill author for a formal install spec or an auditable package (a Homebrew formula or repository), plus cryptographic checksums for any model or installer downloads.
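As a quick check for point 3 above, you can confirm the gateway is bound to the loopback interface only. This is a sketch: port 8000 comes from the skill text, and `lsof` output formats can vary slightly between macOS versions.

```shell
# Show processes listening on TCP port 8000 (the port the skill uses).
lsof -nP -iTCP:8000 -sTCP:LISTEN
# An address of 127.0.0.1:8000 (or [::1]:8000) means loopback-only.
# An address of *:8000 or 0.0.0.0:8000 means the gateway is reachable
# from the network and should be reconfigured or firewalled.
```

If nothing is listed, the oMLX service is not running and the skill's API calls will fail until it is started.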
Review Dimensions
- Purpose & Capability
- ok
- Name/description (local inference via oMLX and uv) align with the runtime instructions: calls to localhost:8000, uv run invocations, and references to ~/models are consistent with running models locally on macOS.
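A one-line smoke test of the local API the skill calls can confirm this behavior yourself. Note the `/v1/models` path is an assumption here (many local gateways expose an OpenAI-compatible API); check oMLX's documentation for its actual routes.

```shell
# Query the local gateway the skill talks to; -f fails on HTTP errors,
# -s suppresses progress output. Truncate the response for readability.
curl -sf http://localhost:8000/v1/models | head -c 400
```

A JSON listing of loaded models indicates the gateway is up; a connection error means nothing is listening on port 8000.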
- Instruction Scope
- ok
- SKILL.md only instructs the agent to call a local HTTP API, run uv to invoke Python model libraries, read model files under ~/models, and use launchctl for the local oMLX service. It does not attempt to read unrelated system files, request unrelated environment variables, or exfiltrate data to remote endpoints.
- Install Mechanism
- note
- There is no formal install spec in the registry; the SKILL.md recommends installing uv via 'curl -LsSf https://astral.sh/uv/install.sh | sh'. Download-and-exec installer instructions are common for CLIs but are higher risk than package manager installs — verify the source before running. No other installers or remote code downloads are required by the skill itself.
- Credentials
- ok
- The skill requests no environment variables or credentials and only requires the 'uv' binary and an Apple Silicon macOS environment, which is proportionate to local model execution. Model files are referenced under the user's home (~), which is expected.
- Persistence & Privilege
- ok
- The skill is not always-on, does not request elevated platform privileges, and does not modify other skills or system-wide configs beyond invoking a user launchctl command to restart the local oMLX service (which affects only the user's launchd job).
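The user-scoped nature of the launchctl restart can be seen in the command shape itself: user agents live in the `gui/<uid>` domain, not the `system` domain. A sketch, noting that "com.example.omlx" is a placeholder label, not the service's actual name:

```shell
# Restart a user-level launchd job. The gui/$(id -u)/ prefix scopes the
# operation to the current user's domain; system daemons are untouched.
# "com.example.omlx" is a placeholder -- use the label from the skill's plist.
launchctl kickstart -k "gui/$(id -u)/com.example.omlx"

# List the current user's jobs to find the real label:
launchctl list | grep -i omlx
```

Because the target is in the user domain, no sudo is required and a misbehaving job cannot affect other users or system services.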
