Excel Data Import

v2.1.0

Import, merge, and transform data from Excel (.xlsx/.csv) files using YAML-driven configuration. Use when the user asks to: (1) import data from Excel/CSV in...

by sniper-one (@aqbjqtd)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for aqbjqtd/excel-data-import.

Prompt Preview: Install & Setup
Install the skill "Excel Data Import" (aqbjqtd/excel-data-import) from ClawHub.
Skill page: https://clawhub.ai/aqbjqtd/excel-data-import
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Canonical install target

openclaw skills install aqbjqtd/excel-data-import

ClawHub CLI

Package manager switcher

npx clawhub@latest install excel-data-import
Security Scan
VirusTotal
Benign
View report →
OpenClaw
Benign
high confidence
Purpose & Capability
Name, description, SKILL.md, examples, and included Python scripts all align: this is an Excel/CSV import, mapping, validation, and batch-processing utility. No unrelated credentials or binaries are requested.
Instruction Scope
Runtime instructions focus on reading local Excel/CSV files, producing outputs/backups, and running the provided scripts (e.g., python3 scripts/excel_import.py). A few small inconsistencies appear in the docs (e.g., 'uv add' in the quickstart, a reference to excel_import_enhanced.py in auto_header_detection.md); these are likely stale documentation errors and do not change the overall scope. The tool performs file I/O, directory traversal, backups, and report generation, as expected for its purpose.
Install Mechanism
No install spec; the bundle contains Python scripts and documentation that instruct users to pip-install openpyxl and pyyaml (and optionally python-calamine). No network-download/third-party install URLs are embedded in the skill metadata. The lack of an automatic install step means the user/host environment will install Python deps before execution.
Credentials
The skill declares no required environment variables or credentials. Some reference docs mention optional remote backup (S3) and encrypt-transform examples that would require user-supplied keys if used, but those are optional examples and not mandatory for normal operation.
Persistence & Privilege
Skill is not always-enabled and uses the normal model-invocation defaults. It does file reads/writes within user-specified paths (backups, logs, outputs) but does not request system-wide privileges or modify other skills' configurations.
Assessment
This skill appears to be what it claims: a local Python-based Excel/CSV import and mapping toolkit. Before running it: (1) review the included Python scripts to ensure they only access the files/paths you expect, (2) install dependencies in a controlled environment (virtualenv), (3) run with --dry-run and small test data first, (4) inspect any backup/remote-backup settings before enabling remote uploads (you must provide cloud credentials yourself), and (5) note minor documentation inconsistencies (typo 'uv add', a reference to excel_import_enhanced.py) that suggest verifying the exact script names/paths in the bundle. If you will run this on sensitive data, audit the code for any network calls or unexpected logging of sensitive fields before use.

Like a lobster shell, security has layers — review code before you run it.

latest: vk97ej9fshkfv3abk3h1eq1me65834dwp
196 downloads
0 stars
2 versions
Updated 1mo ago
v2.1.0
MIT-0

Excel Data Import

Configuration-driven data import from Excel and CSV files with field mapping, validation, and batch processing.

Prerequisites

  • Python 3.8+
  • Required: pip3 install openpyxl pyyaml
  • Optional: pip3 install python-calamine (for .xls legacy format)

Quick Start

# import_config.yaml
task_name: "Personnel Information Import"
source:
  file_path: "data/source.xlsx"
  sheet_name: "Sheet1"
  header_row: 1
  key_field: "ID Number"
target:
  file_path: "output/result.xlsx"
  sheet_name: "Personnel Info"
  header_row: 2
  data_start_row: 3
field_mappings:
  - source: "Name"
    target: "Employee Name"
    required: true
  - source: "ID Number"
    target: "ID Card Number"
    required: true
    validate: "id_card"
  - source: "Department"
    target: "Assigned Department"
    default: "Unassigned"
error_handling:
  backup: true
python3 scripts/excel_import.py import_config.yaml
python3 scripts/excel_import.py import_config.yaml --dry-run   # preview only
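The field_mappings logic above can be sketched in plain Python. This is an illustrative reconstruction, not the skill's actual API; the function name and mapping keys mirror the YAML fields but are assumptions:

```python
def apply_mappings(row, field_mappings):
    """Map one source row (dict) to a target row using a YAML-style mapping list.

    Required fields with no value are reported as errors; optional fields
    fall back to their configured default.
    """
    out, errors = {}, []
    for m in field_mappings:
        value = row.get(m["source"])
        if value in (None, ""):
            if m.get("required"):
                errors.append(f"missing required field: {m['source']}")
                continue
            value = m.get("default", "")
        out[m["target"]] = value
    return out, errors

mappings = [
    {"source": "Name", "target": "Employee Name", "required": True},
    {"source": "Department", "target": "Assigned Department", "default": "Unassigned"},
]
mapped, errs = apply_mappings({"Name": "Alice"}, mappings)
# mapped == {"Employee Name": "Alice", "Assigned Department": "Unassigned"}
```

A row that fails a required mapping would land in the error list instead of the output, which matches the skill's described skip-on-failure behavior.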

Import Modes

Mode            | Source Config            | Use Case
Single file     | source.file_path         | One-to-one import
Directory batch | source.type: "directory" | Process all files in a folder
Multi-source    | sources: [...]           | Merge from multiple files
CSV             | .csv file_path           | Auto-encoding detection (UTF-8/GBK/GB2312)
Legacy .xls     | .xls file_path           | Requires python-calamine
Auto header     | header_row: "auto"       | Detect header in complex sheets
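The CSV auto-encoding detection listed above typically works by trying candidate encodings in order. A minimal sketch (the function name and encoding order are assumptions, not the skill's actual implementation):

```python
def sniff_decode(raw: bytes, encodings=("utf-8-sig", "utf-8", "gbk", "gb2312")):
    """Try each candidate encoding in order; return (text, encoding) on first clean decode."""
    for enc in encodings:
        try:
            return raw.decode(enc), enc
        except UnicodeDecodeError:
            continue
    raise ValueError(f"could not decode with any of {encodings}")

# GBK-encoded Chinese bytes are not valid UTF-8, so detection falls through to gbk.
text, enc = sniff_decode("部门".encode("gbk"))
# enc == "gbk", text == "部门"
```

Trying UTF-8 before GBK matters: many GBK byte sequences are invalid UTF-8 and fail fast, while the reverse order could silently misdecode genuine UTF-8 input.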

For full parameter docs, see data-mapping-guide.md.

Key Features

  • Incremental update: Match by key_field, update existing or append new rows
  • Multi-layer merged headers: Auto-detect and expand merged cell values
  • Validation rollback: Failed rows are skipped entirely (no partial writes)
  • Source deduplication: Duplicate keys across files are merged
  • Auto-create target: Template generated from field_mappings if missing
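The incremental-update behavior (match on key_field, update existing rows, append new ones) can be modeled as a keyed merge. A sketch under the assumption that rows are dicts and later sources win on duplicate keys:

```python
def incremental_merge(existing, incoming, key_field):
    """Merge incoming rows into existing rows, matching on key_field.

    Matching rows are updated in place (last write wins); unmatched
    incoming rows are appended. Insertion order is preserved.
    """
    index = {row[key_field]: dict(row) for row in existing}
    for row in incoming:
        index.setdefault(row[key_field], {}).update(row)
    return list(index.values())

existing = [{"id": "1", "dept": "A"}]
incoming = [{"id": "1", "dept": "B"}, {"id": "2", "dept": "C"}]
merged = incremental_merge(existing, incoming, "id")
# merged == [{"id": "1", "dept": "B"}, {"id": "2", "dept": "C"}]
```

The same dict-keyed index also explains source deduplication: duplicate keys across files collapse into a single entry.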

Built-in Transforms & Validators

Transforms: strip, upper, lower, title, int, float, date

Validators: required, not_empty, id_card, phone, email, numeric, range, regex, length
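A transform/validator pipeline like this is commonly implemented as two lookup tables. The sketch below covers a subset of the names listed above; the registry structure and the phone/email patterns are illustrative assumptions, not the skill's actual rules:

```python
import re

TRANSFORMS = {
    "strip": str.strip,
    "upper": str.upper,
    "lower": str.lower,
    "title": str.title,
    "int": lambda v: int(float(v)),  # tolerates "42.0"-style values from Excel
    "float": float,
}

VALIDATORS = {
    "not_empty": lambda v: bool(str(v).strip()),
    # Mainland-China mobile format, an assumption for illustration:
    "phone": lambda v: bool(re.fullmatch(r"1\d{10}", str(v))),
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", str(v))),
}

def clean(value, transforms=(), validators=()):
    """Apply transforms in order, then report which validators failed."""
    for name in transforms:
        value = TRANSFORMS[name](value)
    failed = [name for name in validators if not VALIDATORS[name](value)]
    return value, failed

value, failed = clean("  Alice@Example.COM  ", ["strip", "lower"], ["email"])
# value == "alice@example.com", failed == []
```

Running transforms before validators means a value like "  13800138000 " can be stripped into shape before the phone check sees it.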

For advanced usage, see advanced-features.md.

CLI Options

Option      | Description
--dry-run   | Preview mode, no file writes
--verbose   | Detailed per-record output
--no-backup | Skip target file backup
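For reference, the CLI surface described above could be reproduced with argparse. This is a hypothetical reconstruction of the interface, not the script's actual source:

```python
import argparse

def build_parser():
    """Sketch of excel_import.py's command-line interface (names assumed from the docs)."""
    p = argparse.ArgumentParser(prog="excel_import.py")
    p.add_argument("config", help="path to the YAML import config")
    p.add_argument("--dry-run", action="store_true", help="preview mode, no file writes")
    p.add_argument("--verbose", action="store_true", help="detailed per-record output")
    p.add_argument("--no-backup", action="store_true", help="skip target file backup")
    return p

args = build_parser().parse_args(["import_config.yaml", "--dry-run"])
# args.config == "import_config.yaml", args.dry_run is True
```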

Reference Documents

  • data-mapping-guide.md — full parameter documentation
  • advanced-features.md — advanced transforms and validators

Workflow

  1. Read user's import requirements and source/target file info
  2. Create or adjust YAML config file
  3. Run python3 scripts/excel_import.py <config.yaml> with --dry-run first
  4. Review output, fix issues, then run without --dry-run
  5. Check the JSON report alongside the output file
