Clay

v1.0.0

Run data quality checks on PMU (Phasor Measurement Unit) data. Use when the user asks to validate, check, or audit PMU measurements including frequency, volt...

by Jiahui (Clay) Yang (@clayutk)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for clayutk/pmu-data-quality-skill.

Prompt Preview: Install & Setup
Install the skill "Clay" (clayutk/pmu-data-quality-skill) from ClawHub.
Skill page: https://clawhub.ai/clayutk/pmu-data-quality-skill
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install pmu-data-quality-skill

ClawHub CLI


npx clawhub@latest install pmu-data-quality-skill
Security Scan
VirusTotal
Benign
OpenClaw
Benign (high confidence)
Purpose & Capability
Name/description match the included files: SKILL.md, a sample CSV, a limits config, and a Python script that implements frequency, voltage, phasor angle, missing-data, and timestamp-gap checks. All requested actions (reading a CSV, applying configurable limits, producing reports) are appropriate for a PMU data quality tool.
Instruction Scope
Runtime instructions restrict the skill to asking for a PMU CSV, checking columns, running the provided Python script, and reporting local outputs. The SKILL.md does not instruct the agent to read unrelated system files, environment variables, or send data to external endpoints.
Install Mechanism
This is an instruction-only skill with an included Python script (no install spec). The script imports pandas and numpy but the skill does not declare these dependencies or require 'python' as a runtime binary; users should ensure a compatible Python environment with required packages is available before running.
Credentials
The skill requests no environment variables, credentials, or config paths. All file access is limited to user-supplied CSVs, the included templates, and locally produced reports, which is proportionate to the stated purpose.
Persistence & Privilege
The `always` flag is false and the skill does not request persistent or elevated privileges. It does write output files (a flagged CSV and an optional HTML report) alongside input files, which is expected for a reporting tool.
Assessment
This skill appears to be a straightforward local PMU CSV quality checker. Before installing/running:

  1. Review the full script if you need to be certain it won't access other files or networks (the visible code imports pandas/numpy and writes local reports).
  2. Run it in an environment with Python and the required libraries (pandas, numpy) installed, or in a sandbox if the data is sensitive.
  3. Be aware it reads whatever CSV path you provide and writes flagged output next to it; avoid pointing it at directories containing unrelated sensitive files.
  4. If you plan to use it in production, confirm the limits_config.json matches your system and review any further code not shown to ensure no unexpected network or subprocess calls.

Like a lobster shell, security has layers — review code before you run it.

latest: vk97dfvkz34y8srcexnbkey0qdh83jyrv
135 downloads
0 stars
1 version
Updated 1mo ago
v1.0.0
MIT-0

PMU Data Quality Checker

Performs automated data quality checks on PMU (Phasor Measurement Unit) CSV data files against configurable security limits defined in IEEE/NERC standards.

What This Skill Checks

  1. Frequency Data — Checks if frequency measurements stay within the nominal range (default: 59.95–60.05 Hz for 60 Hz systems)
  2. Voltage Magnitude Data — Checks if voltage magnitudes stay within acceptable per-unit limits (default: 0.95–1.05 pu)
  3. Phasor Angle Data — Checks if phasor angles stay within expected bounds (default: -180° to +180°, with rate-of-change check)
  4. Missing / NaN Data — Flags rows with missing or null values
  5. Timestamp Continuity — Detects gaps in the reporting rate (e.g., expected 30 samples/sec or 60 samples/sec)
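
The five checks above can be sketched in a few lines of pandas. This is an illustrative sketch, not the skill's actual pmu_quality_check.py (which is not reproduced on this page); the column names follow the expected CSV format described below, and the thresholds are the stated defaults for a 60 Hz system.

```python
import pandas as pd

# Defaults as listed above (assumed; the shipped limits_config.json governs the real run).
FREQ_MIN, FREQ_MAX = 59.95, 60.05   # Hz
VOLT_MIN, VOLT_MAX = 0.95, 1.05     # per-unit
EXPECTED_RATE = 30                  # samples/sec

def run_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of df with one boolean flag column per check."""
    out = df.copy()
    out["freq_violation"] = ~df["frequency"].between(FREQ_MIN, FREQ_MAX)
    out["volt_violation"] = ~df["voltage_mag"].between(VOLT_MIN, VOLT_MAX)
    out["angle_violation"] = ~df["voltage_angle"].between(-180.0, 180.0)
    out["has_missing"] = df.isna().any(axis=1)
    # Timestamp continuity: flag gaps larger than ~1.5x the nominal interval.
    ts = pd.to_datetime(df["timestamp"])
    out["timestamp_gap"] = ts.diff().dt.total_seconds() > 1.5 / EXPECTED_RATE
    return out
```

A row can then be counted as "flagged" if any of the boolean columns is true.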

How to Use

Quick check on a CSV file:

python <skill_base_path>/scripts/pmu_quality_check.py path/to/data.csv

With custom limits config:

python <skill_base_path>/scripts/pmu_quality_check.py path/to/data.csv --config <skill_base_path>/templates/limits_config.json

On the template sample data (for testing):

python <skill_base_path>/scripts/pmu_quality_check.py <skill_base_path>/templates/sample_pmu_data.csv

Expected CSV Format

The input CSV should have columns similar to:

timestamp, frequency, voltage_mag, voltage_angle, current_mag, current_angle
  • timestamp — ISO 8601 or Unix epoch
  • frequency — in Hz
  • voltage_mag — in per-unit (pu) or kV (specify in config)
  • voltage_angle / current_angle — in degrees
  • current_mag — in per-unit (pu) or Amps
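
For illustration, a CSV in the expected format might look like the following (the values are invented, not taken from the bundled sample_pmu_data.csv; the third row shows a frequency/voltage excursion the checks would flag):

```csv
timestamp,frequency,voltage_mag,voltage_angle,current_mag,current_angle
2024-01-01T00:00:00.000Z,60.001,1.002,-12.4,0.87,-35.1
2024-01-01T00:00:00.033Z,59.998,1.001,-12.5,0.86,-35.2
2024-01-01T00:00:00.067Z,59.940,0.948,-12.6,0.85,-35.3
```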

Column names are configurable via the limits config JSON. If the user's CSV uses different column names (like those from openHistorian or FNET exports), update the column_mapping section in the config.
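
Only the column_mapping section is named on this page, so the surrounding keys below are a hypothetical sketch of what such a limits config could look like, not the shipped templates/limits_config.json:

```json
{
  "frequency": { "min": 59.95, "max": 60.05 },
  "voltage_mag": { "min": 0.95, "max": 1.05, "unit": "pu" },
  "expected_rate_hz": 30,
  "column_mapping": {
    "timestamp": "Time",
    "frequency": "Freq_Hz",
    "voltage_mag": "VA_Mag"
  }
}
```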

Output

The script produces:

  • A summary report printed to stdout
  • A flagged rows CSV saved alongside the input (e.g., data_flagged.csv)
  • An optional HTML report with charts if --html flag is passed
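
The flagged-output naming can be derived from the input path; the exact logic in the script is not shown, so this sketch is an assumption based on the data.csv to data_flagged.csv example above:

```python
from pathlib import Path

def flagged_path(input_path: str) -> str:
    """Place the flagged-rows CSV next to the input,
    e.g. path/to/data.csv -> path/to/data_flagged.csv."""
    p = Path(input_path)
    return str(p.with_name(f"{p.stem}_flagged{p.suffix}"))
```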

Workflow

  1. Ask the user for their PMU data file (CSV)
  2. Check if the column names match the expected format; if not, ask or auto-detect
  3. Run the quality check script
  4. Present the summary and ask if the user wants to adjust limits or dig deeper into flagged data
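
Step 2's auto-detection could work from an alias table like the one below. The alias names are assumptions (the real script's detection logic and the exact openHistorian/FNET column names are not shown here):

```python
# Hypothetical aliases for common PMU export column headings.
ALIASES = {
    "timestamp": ["timestamp", "time", "datetime"],
    "frequency": ["frequency", "freq", "freq_hz"],
    "voltage_mag": ["voltage_mag", "vmag", "va_mag"],
}

def detect_columns(columns):
    """Map each expected column name to whatever the CSV actually uses,
    matching case-insensitively against the alias table."""
    lowered = {c.lower(): c for c in columns}
    mapping = {}
    for expected, candidates in ALIASES.items():
        for cand in candidates:
            if cand in lowered:
                mapping[expected] = lowered[cand]
                break
    return mapping
```

Any expected column left out of the returned mapping would then be worth asking the user about before running the checks.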
