Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

HPR Solver

v0.0.9

Solve optimization & planning problems using natural language. Just describe what you need — fast, accurate, and built for AI agents.

★ 1 · 287 downloads · 0 current · 0 all-time
by jiawei_polyu (@ljw2024polyu)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for ljw2024polyu/hpr-solver.

Prompt Preview: Install & Setup
Install the skill "HPR Solver" (ljw2024polyu/hpr-solver) from ClawHub.
Skill page: https://clawhub.ai/ljw2024polyu/hpr-solver
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install hpr-solver

ClawHub CLI

Package manager switcher

npx clawhub@latest install hpr-solver
Security Scan
VirusTotal
Suspicious
View report →
OpenClaw
Suspicious
medium confidence
Purpose & Capability
Name/description claim LP solving with HPR-LP and Julia, which matches the Julia/JuMP usage and hprlp_solve.sh script. However, the package also contains a Python 'hybrid' parser that calls an external LLM endpoint (openrouter.ai) to generate Julia code; that networked LLM fallback is not mentioned in SKILL.md or README and is not obviously required for an HPR-LP solver (the skill could parse NL locally or ask the user). The README additionally claims "No custom code is executed," which contradicts the presence of scripts in the bundle.
Instruction Scope
SKILL.md instructs running Julia and local solver scripts and does not describe any remote LLM calls. The included scripts (scripts/hpr_hybrid.py) contain an LLM_API_URL and model name and appear to send problem text to a third-party API as a fallback parser — this could transmit user problem text externally. The shell script uses absolute user-specific paths (/home/ljw/...), which differ from the tilde-paths in SKILL.md and README and could fail or lead to unintended file access if executed as-is.
Install Mechanism
No formal install spec in registry metadata (instruction-only), which is low-risk. README suggests manual git clone of upstream HPR-LP and installing Julia — these are from official sources. However, the bundle does include executable scripts that would be run; there is no automated installer but the code would be executed by the agent if invoked.
Credentials
Metadata declares no required env vars, but the Python script references an external LLM endpoint and likely needs an API key (not declared). The skill may therefore expect credentials (e.g., OpenRouter API key) or rely on network access without disclosing it in SKILL.md/README. That mismatch between declared environment requirements and actual code is concerning because it can lead to unintended secret use or data exfiltration.
Persistence & Privilege
The skill is not marked always:true and does not request system-wide persistence. It does contain scripts that, if executed, call local Julia binaries and run solver scripts; there is no evidence it modifies other skills or agent-wide settings. The hard-coded /home/ljw paths are brittle but not an elevated privilege request.
What to consider before installing
This skill contains local solver scripts (expected) but also a Python 'hybrid' parser that calls an external LLM (openrouter.ai) and includes hard-coded user paths. Before installing or using it:

1. Inspect scripts/hpr_hybrid.py fully to see whether it sends problem text or other data to the remote LLM, and whether it expects an API key in an env var (the skill metadata does not declare any).
2. If you don't want your problem descriptions sent outside your environment, do not run the hybrid parser; instead run the local Julia hprlp_solve.jl directly after installing Julia and HPR-LP manually.
3. Fix hard-coded paths (e.g., /home/ljw) to point to your environment, or use the tilde-based paths described in the README.
4. Ask the maintainer to document any remote network calls and required environment variables (API keys) in SKILL.md, and to either remove the LLM fallback or make it explicit and opt-in.

If the author confirms there are no remote calls, or that the LLM fallback is removed or made opt-in, these concerns would be largely resolved.
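To make step 1 concrete, here is a minimal audit sketch (a hypothetical helper, not part of the skill) that scans a skill's bundled scripts for exactly the patterns the scan flagged: remote endpoints, undeclared API-key lookups, and hard-coded home paths:

```python
import re
from pathlib import Path

# Patterns matching the concerns raised by the scan: network endpoints,
# environment-variable key lookups, and user-specific absolute paths.
SUSPICIOUS = {
    "remote endpoint": re.compile(r"https?://[\w.-]+"),
    "api key lookup": re.compile(r"os\.environ|getenv\(.*KEY"),
    "hard-coded home path": re.compile(r"/home/\w+"),
}

def audit_scripts(skill_dir):
    """Return {file: [labels]} for every script that matches a pattern."""
    findings = {}
    for path in Path(skill_dir).rglob("*"):
        if path.suffix not in {".py", ".sh", ".jl"} or not path.is_file():
            continue
        text = path.read_text(errors="ignore")
        hits = [label for label, rx in SUSPICIOUS.items() if rx.search(text)]
        if hits:
            findings[str(path)] = hits
    return findings
```

A hit is only a prompt to read the file; it proves nothing by itself, but a script that matches "remote endpoint" and "api key lookup" while declaring no env vars in its metadata is worth a close look before you run it.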

Like a lobster shell, security has layers — review code before you run it.

latest: vk97egc4pnc7m0azaqgsdk485kh8388kp
287 downloads · 1 star · 12 versions
Updated 26m ago
v0.0.9
MIT-0

HPR Solver

Solve Linear Programming problems using HPR solver.

Trigger

When user wants to solve an LP problem (MPS file or natural language description).

Usage

For MPS Files

User provides path to .mps file. Confirm parameters first:

⚠️ Please confirm parameters:
1. stoptol (default 1e-6): ?
2. time_limit (default 3600s): ?
3. device_number (0=GPU, -1=CPU): ?
4. Need variable values? (Yes/No)

After confirmation, run:

~/julia/julia-1.10.4/bin/julia --project=~/.openclaw/workspace/HPR-LP \
  ~/.openclaw/workspace/HPR-LP/hprlp_solve.jl <mps_file> <time_limit> <stoptol> <device>
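The invocation above can be wrapped defensively. Below is a sketch (assuming the tilde paths from this SKILL.md; the helper name is mine, not the skill's) that expands the paths, fails early on a missing MPS file, and builds the exact argument list before anything executes:

```python
from pathlib import Path

# Paths as documented in SKILL.md; adjust to your own install locations.
JULIA = Path("~/julia/julia-1.10.4/bin/julia").expanduser()
PROJECT = Path("~/.openclaw/workspace/HPR-LP").expanduser()

def build_solve_command(mps_file, time_limit=3600, stoptol=1e-6, device=0):
    """Build the argv for hprlp_solve.jl; raise early on a missing MPS file."""
    mps = Path(mps_file).expanduser()
    if not mps.is_file():
        raise FileNotFoundError(f"MPS file not found: {mps}")
    return [
        str(JULIA),
        f"--project={PROJECT}",
        str(PROJECT / "hprlp_solve.jl"),
        str(mps),
        str(time_limit),
        str(stoptol),
        str(device),
    ]
```

The resulting list can be passed to `subprocess.run(cmd)` once you have confirmed the parameters with the user.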

For Natural Language

1. Parse problem, output mathematical model for confirmation:

📐 Mathematical Model:

```
max [objective function]
    [constraint 1]
    [constraint 2]
    [constraint 3]

Variables:
  • x₁ = [description]
  • x₂ = [description]
```

2. After user confirms, ask parameters (same as MPS)

3. Model in Julia/JuMP:

```julia
using JuMP
using HPRLP

model = Model(HPRLP.Optimizer)
set_optimizer_attribute(model, "stoptol", <value>)
set_optimizer_attribute(model, "time_limit", <value>)
set_optimizer_attribute(model, "device_number", <value>)
set_optimizer_attribute(model, "verbose", true)

@variable(model, x1 >= 0)
@variable(model, x2 >= 0)

@constraint(model, c1, <constraint 1>)
@constraint(model, c2, <constraint 2>)

@objective(model, Max, <objective>)

optimize!(model)
```

4. Output Solution Summary:
📊 HPR-LP Results

=== Solution Summary ===
Status: [OPTIMAL/INFEASIBLE/...]
Iterations: <count>
Solve Time: <seconds>
Primal Objective: <value>
Dual Objective: <value>
KKT Error: <error>

=== Variables ===
x₁ = <value>
x₂ = <value>
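The summary template above can be filled mechanically from a results mapping. A sketch follows; the field names mirror the template and are my guesses at what hprlp_solve.jl reports, not a documented interface:

```python
SUMMARY_TEMPLATE = """\
📊 HPR-LP Results

=== Solution Summary ===
Status: {status}
Iterations: {iterations}
Solve Time: {solve_time}
Primal Objective: {primal_obj}
Dual Objective: {dual_obj}
KKT Error: {kkt_error}

=== Variables ===
{variables}"""

def render_summary(results, variables):
    """Fill the template; variables is an ordered {name: value} mapping."""
    var_lines = "\n".join(f"{name} = {value}" for name, value in variables.items())
    return SUMMARY_TEMPLATE.format(variables=var_lines, **results)
```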

Parameters

| Parameter | Default | Description |
|---|---|---|
| stoptol | 1e-6 | Stopping tolerance |
| time_limit | 3600 | Time limit (seconds) |
| device_number | 0 | GPU device (-1 for CPU) |
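A small sketch of the defaults and the range checks the table implies (the validation rules are my reading of it, not documented solver behavior):

```python
DEFAULTS = {"stoptol": 1e-6, "time_limit": 3600, "device_number": 0}

def resolve_params(overrides=None):
    """Merge user overrides with defaults, rejecting obviously bad values."""
    params = {**DEFAULTS, **(overrides or {})}
    if params["stoptol"] <= 0:
        raise ValueError("stoptol must be positive")
    if params["time_limit"] <= 0:
        raise ValueError("time_limit must be positive seconds")
    if params["device_number"] < -1:
        raise ValueError("device_number is a GPU index, or -1 for CPU")
    return params
```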

Non-LP Problems

If problem is NOT linear (has integer vars, x², products, etc.), respond:

⚠️ HPR only supports Linear Programming (LP).

This appears to be:
- Integer/MILP (use GLPK, CBC, HiGHS)
- Non-linear (use Ipopt)
- Quadratic (use Gurobi, CPLEX)
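The routing rule above can be sketched as a keyword heuristic (illustrative only; a real check would inspect the parsed model, not the prose description):

```python
import re

# Surface cues for the three non-LP classes named above; checked in order.
NON_LP_CUES = {
    "integer/MILP": re.compile(r"\binteger\b|\bbinary\b|whole units", re.I),
    "non-linear": re.compile(r"\^2|²|\bsquared\b|\blog\b|\bexp\b", re.I),
    "quadratic": re.compile(r"\bquadratic\b|product of variables", re.I),
}

def classify_problem(description):
    """Return 'LP', or the first non-LP class whose cue appears."""
    for label, rx in NON_LP_CUES.items():
        if rx.search(description):
            return label
    return "LP"
```

Anything other than "LP" should trigger the warning response above instead of a solve attempt.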

Requirements

  • Julia 1.10.4
  • HPR-LP
  • Linux/macOS/Windows
