Korean Gov Programs
Review
Audited by ClawScan on May 10, 2026.
Overview
The skill mostly matches its stated purpose, but the stats helper unsafely embeds a user-supplied path into Python code, allowing a crafted directory name to execute unintended code.
This skill appears to be a public-data crawler and does not request credentials, but review it before installing: one helper script can mishandle crafted directory names. If you use it, run the collector with a simple, trusted output path such as ./data, avoid running stats.sh on untrusted paths, and verify that the installed package includes the expected scripts.
Findings (3)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
If an agent or user runs the stats helper on an untrusted, specially named local directory, commands could execute with the user's permissions.
The script derives CHECKPOINT from a user-controlled argument and embeds it directly inside Python source code instead of passing it as argv or safely escaping it. A crafted directory name could break the string literal and potentially inject Python code.
DATA_DIR="${1:-./data}"
CHECKPOINT="$DATA_DIR/.checkpoint.json"
...
python3 -c "
import json
with open('$CHECKPOINT', encoding='utf-8') as f:
Use only simple trusted output paths for now. The maintainer should change the checkpoint-reading python3 -c block to pass the path as an argument, e.g. python3 - "$CHECKPOINT", and read sys.argv[1].
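One way the maintainer's fix could look is sketched below. It passes the path through argv and uses a quoted heredoc so the shell never expands anything inside the Python source; the checkpoint shape ({"count": N}) and the demo setup lines are assumptions for illustration, not taken from the skill.

```shell
# Fix sketch: the path travels only through sys.argv, never through
# string interpolation into Python source.
DATA_DIR="${1:-./data}"
CHECKPOINT="$DATA_DIR/.checkpoint.json"

# Demo setup (assumed checkpoint shape) so the sketch is runnable.
mkdir -p "$DATA_DIR"
printf '{"count": 3}' > "$CHECKPOINT"

# The quoted 'PY' delimiter disables shell expansion inside the
# Python block; a hostile directory name stays an inert string.
python3 - "$CHECKPOINT" <<'PY'
import json
import sys

with open(sys.argv[1], encoding="utf-8") as f:
    state = json.load(f)
print(state.get("count", 0))
PY
```

With this pattern a directory named, say, `'); import os #` can no longer break out of the string literal, because it is never part of the Python source at all.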
The package metadata is inconsistent, so users have less assurance that the installed package, reviewed files, and documented scripts all come from the same expected release.
The registry/SKILL metadata describes version 1.0.8 and author 'raon', while package.json says 1.0.7 with a different author and would publish only SKILL.md despite the skill relying on scripts.
"version": "1.0.7", "author": "Yeomyeonggeori Inc. <iam@dawn.kim>", "files": ["SKILL.md"]
Verify the installed files before use. The publisher should align version/author metadata and include the scripts in package metadata or provide a clear source repository.
Running the collector will contact public government websites and create or append files in the selected output directory.
The skill tells users to run local scripts that fetch data and write JSONL/checkpoint files. This is expected for the stated crawler purpose, but users should understand it performs local file operations and web requests.
python3 scripts/collect.py --output ./data ... **APPEND only**: existing files are never overwritten ... progress state is saved to .checkpoint.json
Run it from a trusted checkout and choose a dedicated output directory that does not contain unrelated important files.
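The append-only behavior the skill documents can be sketched as below: records are only ever appended to a JSONL file, and progress is rewritten to .checkpoint.json. The record fields and the programs.jsonl file name are assumptions; only .checkpoint.json is named by the skill.

```shell
# Dedicated output directory, isolated from unrelated files.
OUT_DIR=$(mktemp -d)

# Append-only data writes: >> never truncates an existing file.
printf '%s\n' '{"id": 1, "title": "example program"}' >> "$OUT_DIR/programs.jsonl"
printf '%s\n' '{"id": 2, "title": "another program"}'  >> "$OUT_DIR/programs.jsonl"

# Progress checkpoint is the one file that gets rewritten in place.
printf '{"last_id": 2}' > "$OUT_DIR/.checkpoint.json"

wc -l < "$OUT_DIR/programs.jsonl"
```

Running the append steps again only grows the JSONL file, which matches the skill's claim that existing output is never overwritten.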
