Skill flagged — suspicious patterns detected
ClawHub Security flagged this skill as suspicious. Review the scan results before using.
Box-KVCache
v1.1.0 · Local KV-cache compression for LLMs using low-rank decomposition and INT8 quantization to reduce GPU memory by 2-4x during inference.
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan (OpenClaw)
Verdict: Suspicious, medium confidence

Purpose & Capability
Name and description match the included code: the scripts implement low-rank SVD compression and INT8 quantization for KV caches, plus helpers to detect and run Ollama/llama.cpp. However, the SKILL.md claims cross-platform support (Windows, Linux, macOS) while the scripts are largely Windows-biased (they use 'tasklist | findstr', 'where', and PowerShell fallbacks). The SKILL.md also documents OLLAMA_* environment variables as useful, but none are required in the registry metadata and the scripts do not actually read OLLAMA_HOST, OLLAMA_MODELS, or OLLAMA_KEEP_ALIVE, an internal inconsistency.
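For context, the compression the scan describes (low-rank SVD factors followed by INT8 quantization of each factor) can be sketched roughly as below. The function names, shapes, and scaling scheme here are illustrative assumptions, not the skill's actual API:

```python
import numpy as np

def compress_kv(kv: np.ndarray, rank: int):
    """Sketch: factor a (seq_len, head_dim) KV block via truncated SVD,
    then quantize each factor to INT8 with a per-tensor scale."""
    U, s, Vt = np.linalg.svd(kv, full_matrices=False)
    A = U[:, :rank] * s[:rank]          # (seq_len, rank), carries singular values
    B = Vt[:rank, :]                    # (rank, head_dim)

    def quantize(x):
        scale = np.abs(x).max() / 127.0 or 1.0   # guard against all-zero input
        return (x / scale).round().astype(np.int8), np.float32(scale)

    (qa, sa), (qb, sb) = quantize(A), quantize(B)
    return qa, sa, qb, sb

def decompress_kv(qa, sa, qb, sb):
    """Dequantize both factors and multiply them back together."""
    return (qa.astype(np.float32) * sa) @ (qb.astype(np.float32) * sb)
```

With rank well below min(seq_len, head_dim), the two INT8 factors are several times smaller than the original FP32 block, which is consistent with the 2-4x memory claim.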
Instruction Scope
Runtime instructions and scripts stay within the stated purpose (environment detection, compression, quantization, and launching Ollama). They run local subprocess commands (ollama, nvidia-smi, pip, systeminfo/tasklist/where) and perform on-disk saves/loads of compressed arrays. A few minor issues: several commands use shell=True in run_cmd (which can be risky if later passed untrusted input), and some Windows-only commands are used despite cross-platform claims. There is no evidence the scripts attempt to read unrelated credentials or exfiltrate data to remote endpoints.
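The shell=True concern above can be avoided by passing an argument list instead of a string. A hypothetical hardened run_cmd (a sketch, not the skill's code) might look like:

```python
import shutil
import subprocess

def run_cmd(args: list[str], timeout: float = 30.0) -> str:
    """Run a command without shell=True: an argument list is never
    re-parsed by a shell, so untrusted input cannot inject commands."""
    exe = shutil.which(args[0])          # portable lookup, replaces 'where'
    if exe is None:
        raise FileNotFoundError(f"{args[0]} not found on PATH")
    result = subprocess.run([exe, *args[1:]], capture_output=True,
                            text=True, timeout=timeout, check=True)
    return result.stdout
```

The timeout and check=True also turn hangs and non-zero exits into exceptions rather than silent failures.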
Install Mechanism
No install specification is provided (instruction-only in registry), and all code is included in the bundle. Nothing is downloaded from external URLs during installation. This limits supply-chain risk compared with remote downloads.
Credentials
The skill declares no required environment variables or credentials in registry metadata (good). SKILL.md documents optional OLLAMA_* variables, but they are informational only: the scripts never read them. No secrets or unrelated credentials are requested. The mismatch between documented env vars and actual usage is confusing but not directly dangerous.
Persistence & Privilege
The skill does not request always:true and does not modify other skills or system-wide configurations. It can start/launch an Ollama local service (calls 'ollama serve' and runs 'ollama run'), which is expected for this functionality but means it will start local processes if you run it.
What to consider before installing
This package appears to implement the described KV-cache compression algorithms and helper scripts, but review before running:
- Inspect scripts locally (they are included) and run them in a sandbox or non-production environment first.
- Note platform bias: many checks use Windows commands; Linux/macOS behavior may be limited. Test on your target OS.
- Be aware scripts invoke shell commands (subprocess with shell=True in run_cmd). While current commands are internal, avoid running with elevated privileges and avoid passing untrusted input into those helpers.
- The README/SKILL.md mention OLLAMA_* env vars but the scripts do not read them — if you depend on custom Ollama host/settings verify the tools actually honor them.
- The tool will start/launch local Ollama processes; confirm your Ollama installation and model binaries are from trusted sources and you are comfortable running local services.
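A portable replacement for the Windows-only detection ('where', 'tasklist | findstr') could lean on shutil.which and the ollama CLI itself. This is a sketch under the assumption that 'ollama ps' exits non-zero when no server is listening, not the skill's implementation:

```python
import shutil
import subprocess

def server_reachable(binary: str = "ollama") -> bool:
    """Cross-platform check: locate the binary with shutil.which (works on
    Windows, Linux, and macOS) and ask it whether a server is responding."""
    if shutil.which(binary) is None:     # replaces 'where ollama'
        return False
    try:
        # 'ollama ps' talks to the local server; failure means it's down
        subprocess.run([binary, "ps"], capture_output=True,
                       timeout=5, check=True)
        return True
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired, OSError):
        return False
```

Asking the CLI directly also avoids parsing locale-dependent tasklist/ps output.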
If you want higher confidence, ask the author for: (1) an explicit support matrix for Linux/macOS, (2) clarification on whether the OLLAMA_* env vars are actually read and how, and (3) a non-Windows command path for environment detection.
Latest version: vk970tpecyq1v8vfexv80t3gzzx848v1k
