Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

SQL Guard Copilot

Simplify SQL querying and troubleshooting for MySQL, PostgreSQL, and SQLite. Use when users ask to inspect schema, convert natural language to SQL, debug SQL...

MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal: Benign
OpenClaw: Suspicious (medium confidence)
Purpose & Capability
The name/description (SQL helper for MySQL/Postgres/SQLite) matches the included code and commands. However, the registry metadata declares no required environment variables or primary credential while the runtime instructions and code require SQL_DSN for DB access and optionally OPENAI_API_KEY / OPENAI_BASE_URL for natural-language 'ask' mode. That metadata omission reduces transparency.
Instruction Scope
SKILL.md and the CLI prominently instruct setting SQL_DSN and (for 'ask') OPENAI_API_KEY and base URL. The 'ask' flow builds a schema prompt (tables/columns) and sends it to an LLM endpoint (default https://api.openai.com or any user-supplied base URL). That means database schema — and potentially query text or results depending on options — will be transmitted off-host. The skill otherwise enforces read-only guards and blocks DDL/DML tokens, which is appropriate, but the external transmission of schema/data is a significant privacy/exfiltration risk and should be explicitly considered before use.
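To make the off-host transmission concrete, here is a minimal sketch of what an 'ask'-style flow sends to the LLM endpoint. The function and field names are illustrative only, not the skill's actual API; the point is that table and column names end up verbatim in the outbound prompt.

```python
# Hypothetical sketch of an "ask"-style prompt builder; names are
# illustrative, not taken from scripts/sql_easy.py.
def build_schema_prompt(schema, question, max_tables=20, max_columns=30):
    """Render table/column names into the text posted to the LLM endpoint."""
    lines = []
    for table, columns in list(schema.items())[:max_tables]:
        cols = ", ".join(columns[:max_columns])
        lines.append(f"TABLE {table} ({cols})")
    # Everything in the returned string leaves the host once sent to the API.
    return "Schema:\n" + "\n".join(lines) + f"\n\nQuestion: {question}"

prompt = build_schema_prompt(
    {"daily_kline": ["code", "trade_date", "close"]},
    "show symbols with old sell signals",
)
```

Anything visible in `prompt` — including schema naming that may itself be sensitive — is exposed to whichever host OPENAI_BASE_URL points at.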
Install Mechanism
No install spec (instruction-only with a single Python script). Dependencies (pymysql, psycopg/psycopg2) are imported at runtime and only required for corresponding DB drivers; the script raises clear errors if a dependency is missing. No remote arbitrary downloads are used in the provided files.
Credentials
The skill requires sensitive inputs at runtime: SQL_DSN (which contains DB host, username, and password in the examples) and optionally OPENAI_API_KEY/OPENAI_BASE_URL. The registry did not declare these env vars. Asking for an LLM API key and sending schema/data to an LLM is proportional for the 'ask' capability but increases risk: use of an unrestricted base URL allows pointing to arbitrary endpoints. Audit logging to a JSONL file is available and should be configured carefully.
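Because an unrestricted base URL can redirect schema and prompts anywhere, a wrapper that pins the endpoint is a cheap mitigation. The skill does not ship such a check; the sketch below is an assumption-laden illustration (the allowlist contents are your policy, and the host names here are placeholders).

```python
# Illustrative allowlist check for an LLM base URL. The skill itself accepts
# any OPENAI_BASE_URL, which is exactly the risk flagged above.
from urllib.parse import urlparse

# Assumption: these hosts reflect *your* policy; adjust as needed.
TRUSTED_HOSTS = {"api.openai.com", "llm.internal.example"}

def base_url_allowed(base_url: str) -> bool:
    """Accept only HTTPS URLs whose host is on the allowlist."""
    parsed = urlparse(base_url)
    return parsed.scheme == "https" and parsed.hostname in TRUSTED_HOSTS
```

Run such a check before exporting OPENAI_BASE_URL into the skill's environment, not after.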
Persistence & Privilege
The 'always' flag is false, and the skill does not request persistent platform-wide privileges. It writes optional local audit logs (to a user-specified path) but does not modify other skills or system-wide config. Agent autonomous invocation is allowed by default (normal), but weigh this against the data-exfiltration risk before enabling autonomous runs.
What to consider before installing
This skill appears to be a legitimate SQL helper, but review these points before installing or using it:

  • Expect to supply SQL_DSN (DB credentials) and, for the natural-language 'ask' feature, an OPENAI_API_KEY and OPENAI_BASE_URL. The registry metadata did not declare these; verify you are comfortable passing credentials via environment variables or CLI.
  • The 'ask' command sends schema and prompts to an external LLM endpoint (defaults to api.openai.com). That will expose table/column names and possibly query text/results to the remote model. If your data or schema are sensitive, do not use 'ask' against a remote LLM; consider --dry-run, a local/private LLM endpoint, or disabling 'ask'.
  • Use a least-privilege, read-only DB user in SQL_DSN. Test against a non-sensitive sample DB first to confirm the read-only guard blocks writes and DDL as promised.
  • If you must allow LLM access, set OPENAI_BASE_URL to a trusted host, restrict --max-tables/--max-columns, and review the prompt with --show-prompt before execution.
  • Enable audit logging to a secure location (SQL_EASY_AUDIT_LOG or --audit-log) and inspect the logs for any unexpected data capture.
  • Review scripts/sql_easy.py yourself (or have a trusted reviewer do so) before running it in production, and pin dependency versions when installing runtime libraries.
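The advice to confirm the read-only guard against a non-sensitive database can be done in a few lines. This is a generic smoke test with a stand-in guard, not the skill's actual code path; it shows the shape of the check, run against a throwaway in-memory SQLite database.

```python
# Smoke-test sketch: verify a read-only guard passes SELECT and blocks writes.
# guarded_query() is a stand-in for the skill's query path, not its real code,
# and a prefix regex like this is deliberately simplistic.
import re
import sqlite3

DESTRUCTIVE = re.compile(
    r"^\s*(INSERT|UPDATE|DELETE|DROP|ALTER|TRUNCATE|CREATE|REPLACE)\b", re.I
)

def guarded_query(conn, sql):
    if DESTRUCTIVE.match(sql):
        raise PermissionError("read-only guard: statement rejected")
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")          # throwaway, non-sensitive DB
conn.execute("CREATE TABLE t (code TEXT, close REAL)")
conn.execute("INSERT INTO t VALUES ('AAPL', 1.0)")

rows = guarded_query(conn, "SELECT code, close FROM t")   # should succeed
blocked = False
try:
    guarded_query(conn, "UPDATE t SET close = 0")         # must be rejected
except PermissionError:
    blocked = True
```

Run the equivalent checks through scripts/sql_easy.py itself against a sample database, and also try sneakier forms (comments, CTEs wrapping writes) that a naive prefix check would miss.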

Like a lobster shell, security has layers — review code before you run it.

Current version: v0.2.0
Tags: database, latest, mysql, nl2sql, postgres, safety, sql, sqlite

License

MIT-0
Free to use, modify, and redistribute. No attribution required.

SKILL.md

SQL Query Copilot

Overview

Use this skill to turn plain-language requests into executable SQL with a predictable, low-risk workflow. Default to read-only execution and validate every query against schema before running.

Quick Start

Set SQL_DSN first (or pass --dsn each time).

# PowerShell
$env:SQL_DSN="mysql://user:password@127.0.0.1:3306/stock_monitor"
$env:SQL_DSN="postgres://user:password@127.0.0.1:5432/stock_monitor"
$env:SQL_DSN="sqlite:///d:/data/demo.db"

# Windows CMD
set SQL_DSN=mysql://user:password@127.0.0.1:3306/stock_monitor
set SQL_DSN=postgres://user:password@127.0.0.1:5432/stock_monitor
set SQL_DSN=sqlite:///d:/data/demo.db

# Bash / Zsh
export SQL_DSN="mysql://user:password@127.0.0.1:3306/stock_monitor"
export SQL_DSN="postgres://user:password@127.0.0.1:5432/stock_monitor"
export SQL_DSN="sqlite:///d:/data/demo.db"

Core commands:

python scripts/sql_easy.py tables
python scripts/sql_easy.py describe daily_kline
python scripts/sql_easy.py lint --sql "SELECT * FROM daily_kline"
python scripts/sql_easy.py explain --sql "SELECT code, close FROM daily_kline WHERE trade_date >= '2026-01-01'"
python scripts/sql_easy.py query --sql "SELECT code, close FROM daily_kline ORDER BY trade_date DESC" --limit 50
python scripts/sql_easy.py query --sql "SELECT code, close FROM daily_kline" --summary
python scripts/sql_easy.py ask --q "show symbols with old sell signals older than 20 days" --summary
python scripts/sql_easy.py profile

Set OPENAI_API_KEY (or pass --api-key) to use ask.

v0.2 Highlights

  • Multi-engine support: MySQL, PostgreSQL, SQLite.
  • SQL lint engine: catches high-risk patterns before execution.
  • Explain mode: quickly inspect query plan (EXPLAIN / EXPLAIN QUERY PLAN).
  • Natural-language mode: ask generates SQL from user intent.
  • Query summary: auto-profiles returned columns (null ratio, distinct count, min/max/avg).
  • Slow query warning: highlights expensive queries using --slow-ms.
  • Audit log: write command metadata to JSONL via --audit-log or SQL_EASY_AUDIT_LOG.
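Since the audit log is JSONL (one JSON object per line), it is easy to load and scan programmatically. The field names below are assumptions for illustration; inspect your actual --audit-log file to see what metadata the skill records.

```python
# Sketch of loading and scanning a JSONL audit log. Record field names are
# assumptions; check what your audit file actually contains.
import json

def load_audit(path):
    """Parse one JSON object per non-empty line."""
    with open(path, encoding="utf-8") as fh:
        return [json.loads(line) for line in fh if line.strip()]

def suspicious_entries(entries):
    """Flag records whose serialized form mentions an obviously sensitive token."""
    return [e for e in entries if "password" in json.dumps(e).lower()]
```

Periodically reviewing this log is the cheapest way to catch the "unexpected data capture" the scan warns about.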

Workflow

  1. Clarify the metric and grain. Ask for time window, dimensions, and output columns before writing SQL.

  2. Discover schema first. Run tables, describe <table>, and profile before any complex SQL.

  3. Draft SQL in read-only mode. Use SELECT or WITH; keep columns explicit and add time filters.

  4. Execute with guardrails. Run via scripts/sql_easy.py query, keep --limit unless full export is explicitly needed.

  5. Validate results. Cross-check row count, null ratio, and edge dates; adjust query and rerun.
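The validation in step 5 amounts to a small per-column profile. This sketch shows the arithmetic (null ratio, distinct count, min/max) on a plain Python list; it is not the skill's implementation of --summary.

```python
# Per-column validation sketch matching workflow step 5; not the skill's
# actual --summary implementation.
def profile_column(values):
    """Return null ratio, distinct count, and min/max for one column."""
    non_null = [v for v in values if v is not None]
    return {
        "null_ratio": 1 - len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }
```

A null ratio or distinct count far from expectation is usually the first sign the query grain is wrong.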

Guardrails

  • Default to read-only SQL.
  • Reject destructive statements (INSERT, UPDATE, DELETE, DROP, ALTER, TRUNCATE, etc.).
  • Prefer explicit columns over SELECT * for production/report queries.
  • Run lint before heavy or scheduled queries.
  • Run explain before approving complex joins/window queries.
  • Always quote identifiers when table/column names are uncertain.
  • For business decisions, provide both SQL and a short interpretation of returned data.
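The guardrails above can be approximated with a few regex checks. The skill's real lint command may use different rules and messages; this is a hedged sketch of the idea, not its implementation.

```python
# Illustrative lint rules mirroring the guardrails list; the skill's real
# `lint` command may differ in rules and wording.
import re

def lint_sql(sql: str) -> list[str]:
    warnings = []
    if re.search(r"\bSELECT\s+\*", sql, re.I):
        warnings.append("prefer explicit columns over SELECT *")
    if re.match(r"\s*(INSERT|UPDATE|DELETE|DROP|ALTER|TRUNCATE)\b", sql, re.I):
        warnings.append("destructive statement rejected in read-only mode")
    if not re.search(r"\bLIMIT\b", sql, re.I):
        warnings.append("no LIMIT clause; consider bounding the result set")
    return warnings
```

Treat a non-empty warning list as a reason to stop and rewrite before execution, especially for scheduled queries.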

Query Patterns

Read references/query_patterns.md when creating:

  • Top-N and ranking queries
  • Time-window aggregation
  • Dedup with window functions
  • Funnel-style conditional counts
  • Data quality checks (null/duplicate/outlier)
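As a concrete instance of the dedup-with-window-functions pattern above, here is a latest-row-per-code query run against a throwaway SQLite database (window functions need SQLite 3.25+). The table and column names echo the examples earlier in this SKILL.md; the data is invented.

```python
# Dedup pattern: keep the latest row per code with ROW_NUMBER().
# Invented sample data; requires SQLite 3.25+ for window functions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_kline (code TEXT, trade_date TEXT, close REAL)")
conn.executemany(
    "INSERT INTO daily_kline VALUES (?, ?, ?)",
    [("AAPL", "2026-01-01", 10.0),
     ("AAPL", "2026-01-02", 11.0),
     ("MSFT", "2026-01-01", 20.0)],
)

latest = conn.execute("""
    SELECT code, trade_date, close FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY code ORDER BY trade_date DESC
        ) AS rn
        FROM daily_kline
    ) WHERE rn = 1
    ORDER BY code
""").fetchall()
```

The same PARTITION BY / ORDER BY skeleton covers Top-N per group: change the WHERE rn = 1 filter to rn <= N.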

Read references/chanquant_templates.md for Chanquant-specific query templates.

Files
