glm-output

v0.1.0

Read and process GLM output files. Use when you need to extract temperature data from NetCDF output, convert depth coordinates, or calculate RMSE against observations.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for wu-uk/glm-lake-mendota-glm-output.

Prompt Preview: Install & Setup
Install the skill "glm-output" (wu-uk/glm-lake-mendota-glm-output) from ClawHub.
Skill page: https://clawhub.ai/wu-uk/glm-lake-mendota-glm-output
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install glm-lake-mendota-glm-output

ClawHub CLI


npx clawhub@latest install glm-lake-mendota-glm-output
Security Scan
VirusTotal
Benign
OpenClaw
Benign
medium confidence
Purpose & Capability
The skill's name and description (GLM output processing, depth conversion, RMSE) align with the SKILL.md content, which contains Python code that does exactly that. Minor mismatch: the skill declares no dependencies, but the code requires Python packages (netCDF4, numpy, pandas) that are not listed or documented.
Instruction Scope
Instructions are narrowly scoped to reading local GLM output (output/output.nc), converting z to depth using lake_depth from glm3.nml, reading a CSV of observations, and computing RMSE. The guide does not instruct reading unrelated system files or sending data to external endpoints. Concerns: the code hard-codes a start_date (datetime(2009,1,1,12,0,0)) instead of reading the NetCDF time units attribute (e.g., 'hours since YYYY-MM-DD...'), which can produce incorrect datetimes; the guide also assumes the user will supply lake_depth manually rather than parsing glm3.nml.
Install Mechanism
Instruction-only skill with no install spec and no code files — lowest install risk. No downloads or archive extraction are present.
Credentials
The skill requests no environment variables, credentials, or config paths. The operations in the SKILL.md use local files only (NetCDF and a CSV), which is proportionate to the stated purpose.
Persistence & Privilege
The skill is not always-enabled and does not request persistent system presence or modify other skills. Autonomous invocation is allowed by platform default but is not combined with other concerning privileges here.
Assessment
This skill appears to do what it claims, but check a few things before using it: (1) Install and verify required Python packages (netCDF4, numpy, pandas) — the SKILL.md does not declare them. (2) Do not assume the hard-coded start_date is correct — read the NetCDF time units (e.g., 'hours since YYYY-MM-DD HH:MM:SS') or derive the reference date programmatically to get correct datetimes. (3) Consider programmatically parsing glm3.nml to obtain lake_depth instead of entering it manually to avoid human error. (4) Verify that the local files referenced (output/output.nc and your observations CSV) are the intended, non-sensitive files. None of the instructions transmit data externally or request unrelated credentials, so the risk is primarily correctness/robustness rather than malicious behavior.

Like a lobster shell, security has layers — review code before you run it.

latest: vk97frv8h8an8a4yzxnxszayj0d84xtp6
76 downloads
0 stars
1 version
Updated 1w ago
v0.1.0
MIT-0

GLM Output Guide

Overview

GLM produces NetCDF output containing simulated water temperature profiles. Processing this output requires understanding the coordinate system and matching with observations.

Output File

After running GLM, results are in output/output.nc:

Variable | Description                          | Shape
time     | Hours since simulation start         | (n_times,)
z        | Height from lake bottom (not depth!) | (n_times, n_layers, 1, 1)
temp     | Water temperature (°C)               | (n_times, n_layers, 1, 1)

Reading Output with Python

from netCDF4 import Dataset
import numpy as np
import pandas as pd
from datetime import datetime

nc = Dataset('output/output.nc', 'r')
time = nc.variables['time'][:]
z = nc.variables['z'][:]
temp = nc.variables['temp'][:]
nc.close()

Coordinate Conversion

Important: GLM z is height from lake bottom, not depth from surface.

# Convert to depth from surface
# Set LAKE_DEPTH based on lake_depth in &init_profiles section of glm3.nml
LAKE_DEPTH = <lake_depth_from_nml>
depth_from_surface = LAKE_DEPTH - z
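Rather than copying lake_depth out of glm3.nml by hand, it can be extracted programmatically, which avoids transcription errors. A minimal sketch, assuming the usual `lake_depth = <number>` assignment inside the &init_profiles section (read_lake_depth is a hypothetical helper, not part of the skill):

```python
import re

def read_lake_depth(nml_text):
    # Hypothetical helper: extract lake_depth from the text of glm3.nml.
    # Assumes the standard "lake_depth = <number>" form in &init_profiles.
    match = re.search(r'lake_depth\s*=\s*([0-9]+(?:\.[0-9]+)?)', nml_text)
    if match is None:
        raise ValueError('lake_depth not found in glm3.nml')
    return float(match.group(1))

example = """&init_profiles
   lake_depth = 25.0
   num_depths = 3
/"""
LAKE_DEPTH = read_lake_depth(example)  # 25.0 for this example text
```

In practice you would pass the contents of your glm3.nml file instead of the example string.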

Complete Output Processing

from netCDF4 import Dataset
import numpy as np
import pandas as pd
from datetime import datetime

def read_glm_output(nc_path, lake_depth):
    nc = Dataset(nc_path, 'r')
    time = nc.variables['time'][:]
    z = nc.variables['z'][:]
    temp = nc.variables['temp'][:]
    start_date = datetime(2009, 1, 1, 12, 0, 0)  # hard-coded; verify against nc.variables['time'].units

    records = []
    for t_idx in range(len(time)):
        hours = float(time[t_idx])
        date = pd.Timestamp(start_date) + pd.Timedelta(hours=hours)
        heights = z[t_idx, :, 0, 0]
        temps = temp[t_idx, :, 0, 0]

        for d_idx in range(len(heights)):
            h_val = heights[d_idx]
            t_val = temps[d_idx]
            if not np.ma.is_masked(h_val) and not np.ma.is_masked(t_val):
                depth = lake_depth - float(h_val)
                if 0 <= depth <= lake_depth:
                    records.append({
                        'datetime': date,
                        'depth': round(depth),
                        'temp_sim': float(t_val)
                    })
    nc.close()

    df = pd.DataFrame(records)
    df = df.groupby(['datetime', 'depth']).agg({'temp_sim': 'mean'}).reset_index()
    return df
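The start_date above is hard-coded; NetCDF files record the reference date in the time variable's units attribute (e.g. 'hours since 2009-01-01 12:00:00', read via nc.variables['time'].units), so it is safer to derive it from the file. A standard-library sketch of that parsing (parse_time_units and to_datetime are hypothetical helpers; netCDF4's num2date handles this, including calendars, more robustly):

```python
from datetime import datetime, timedelta

def parse_time_units(units):
    # Hypothetical helper: split a CF-style units string such as
    # "hours since 2009-01-01 12:00:00" into (unit, reference datetime).
    unit, _, ref = units.partition(' since ')
    return unit.strip(), datetime.strptime(ref.strip(), '%Y-%m-%d %H:%M:%S')

def to_datetime(value, units):
    # Convert one numeric time value to a datetime using the units string.
    unit, ref = parse_time_units(units)
    seconds_per = {'seconds': 1.0, 'hours': 3600.0, 'days': 86400.0}[unit]
    return ref + timedelta(seconds=float(value) * seconds_per)

# One day (24 h) after the reference date:
to_datetime(24.0, 'hours since 2009-01-01 12:00:00')  # datetime(2009, 1, 2, 12, 0)
```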

Reading Observations

def read_observations(obs_path):
    df = pd.read_csv(obs_path)
    df['datetime'] = pd.to_datetime(df['datetime'])
    df['depth'] = df['depth'].round().astype(int)
    df = df.rename(columns={'temp': 'temp_obs'})
    return df[['datetime', 'depth', 'temp_obs']]

Calculating RMSE

def calculate_rmse(sim_df, obs_df):
    merged = pd.merge(obs_df, sim_df, on=['datetime', 'depth'], how='inner')
    if len(merged) == 0:
        return 999.0  # sentinel: no overlapping (datetime, depth) records
    rmse = np.sqrt(np.mean((merged['temp_sim'] - merged['temp_obs'])**2))
    return rmse

# Usage: get lake_depth from glm3.nml &init_profiles section
sim_df = read_glm_output('output/output.nc', lake_depth=25)
obs_df = read_observations('field_temp_oxy.csv')
rmse = calculate_rmse(sim_df, obs_df)
print(f"RMSE: {rmse:.2f} °C")
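An RMSE computed over an empty or tiny merge is not meaningful, so it helps to report how many observations actually matched before trusting the number. A small diagnostic sketch (merge_report is a hypothetical helper; column names follow the frames built above):

```python
import pandas as pd

def merge_report(sim_df, obs_df):
    # Report how many observation rows found a simulated counterpart,
    # to catch datetime/depth misalignment before trusting the RMSE.
    merged = pd.merge(obs_df, sim_df, on=['datetime', 'depth'], how='inner')
    print(f'matched {len(merged)} of {len(obs_df)} observation rows')
    return merged
```

If the matched count is far below the observation count, check datetime formats and depth rounding before interpreting the RMSE.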

Common Issues

Issue                   | Cause                  | Solution
RMSE very high          | Wrong depth conversion | Use lake_depth - z, not z directly
No matched observations | Datetime mismatch      | Check datetime format consistency
Empty merged dataframe  | Depth rounding issues  | Round depths to integers

Best Practices

  • Check lake_depth in &init_profiles section of glm3.nml
  • Always convert z to depth from surface before comparing with observations
  • Round depths to integers for matching
  • Group by datetime and depth to handle duplicate records
  • Check number of matched observations after merge
