DICOM Anonymizer

v0.1.0

Batch anonymize DICOM medical images by removing sensitive patient information (name, ID, birth date) while preserving image data for research use. Trigger w...

by AIpoch (@aipoch-ai)
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal
Benign
OpenClaw
Benign
high confidence
Purpose & Capability
The name/description (batch DICOM anonymization) matches the included SKILL.md and scripts/main.py: code targets DICOM tags, uses pydicom, supports batch processing, hashing/pseudonyms, and audit logs. Required dependencies (pydicom) are appropriate and proportionate.
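For context, the tag-scrubbing pattern the scan describes can be modeled with a stdlib-only sketch. A plain dict stands in for a pydicom Dataset here (the real script uses pydicom, which is not assumed installed); the tag names are standard DICOM attribute keywords, and the replacement value is a placeholder chosen for illustration:

```python
# Illustrative sketch only: models tag-based anonymization with a plain dict
# in place of a pydicom Dataset. Tag names are standard DICOM attribute
# keywords; "ANONYMIZED" is a placeholder replacement value.
SENSITIVE_TAGS = {"PatientName", "PatientID", "PatientBirthDate"}

def anonymize(record: dict) -> dict:
    """Return a copy of `record` with sensitive tags replaced."""
    return {
        tag: ("ANONYMIZED" if tag in SENSITIVE_TAGS else value)
        for tag, value in record.items()
    }

scan = {
    "PatientName": "Doe^Jane",
    "PatientID": "12345",
    "PatientBirthDate": "19800101",
    "Modality": "CT",          # non-identifying tags pass through untouched
}
clean = anonymize(scan)
```

The original input is left unmodified, which is the safe default for batch tools that write anonymized copies alongside an audit log.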
Instruction Scope
SKILL.md directs the agent to read DICOM files/directories, produce anonymized outputs and audit logs — all within the skill's stated purpose. There are no instructions to read unrelated system files or to transmit data externally in the provided materials.
Install Mechanism
No install spec is present (instruction-only), and dependencies are standard Python packages (pydicom). This minimizes installer risk; users should install dependencies from trusted PyPI sources.
Credentials
The skill requests no environment variables, credentials, or config paths. The functionality (local file processing and logging) does not require secrets, so the absence of requested credentials is appropriate.
Persistence & Privilege
The always flag is false, and the skill does not request persistent system privileges or the ability to modify other skills. Autonomous invocation is allowed by default (disable-model-invocation is false), which is normal and expected.
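For reference, these flags live in the SKILL.md frontmatter. A minimal sketch of what such metadata might look like (field names are taken from the report above; the slug, description, and layout are assumptions, not the skill's actual file):

```yaml
---
name: dicom-anonymizer          # hypothetical slug
description: Batch anonymize DICOM medical images ...
# Both flags below are reported as false for this skill:
always: false                    # not loaded into every session
disable-model-invocation: false  # the model may invoke it autonomously
---
```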
Assessment
This appears to be a locally-run DICOM anonymizer and is internally consistent, but before using it on real patient data:

1) Review the full scripts/main.py (truncated portions) to confirm there are no hidden network calls, telemetry, or code paths that write PHI elsewhere.
2) Verify the hashing/pseudonymization approach (is a salt used? how are pseudonym mappings stored?) so hashes cannot be trivially correlated across datasets or leaked.
3) Ensure audit logs do not leak raw PHI; store them securely and limit access.
4) Test on synthetic data to confirm pixel data and burned-in text (if any) are handled as you expect; many anonymizers do not OCR burned-in patient identifiers.
5) Install pydicom and other dependencies from trusted sources (official PyPI) and run in an isolated environment until you're confident.

If you want, I can scan the remaining/truncated parts of scripts/main.py for specific concerns (network calls, subprocess usage, writing to unexpected locations): provide the full file and I'll review it.
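The salting concern in point 2 can be illustrated with a short stdlib sketch. A keyed hash (HMAC) with a per-project secret key means pseudonyms cannot be recomputed from known patient IDs without that key, blocking dictionary-style correlation; whether this skill does anything like it is exactly what needs verifying. The key size and 16-character truncation below are illustrative choices, not the skill's actual parameters:

```python
import hashlib
import hmac
import secrets

# Per-project secret key; must be stored securely and reused across runs
# if pseudonyms need to stay stable within one project.
SALT = secrets.token_bytes(32)

def pseudonym(patient_id: str) -> str:
    """Derive a stable, non-reversible pseudonym from a patient ID."""
    return hmac.new(SALT, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

p1 = pseudonym("12345")
p2 = pseudonym("12345")   # same ID, same key -> same pseudonym
p3 = pseudonym("67890")   # different ID -> different pseudonym
```

By contrast, an unsalted sha256(patient_id) is trivially reversible for short numeric IDs by brute force, which is why the question in point 2 matters.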

Like a lobster shell, security has layers — review code before you run it.

latest: vk97fzy6tptbxhetbh9zzh6ka15836x92

