Datasets
Browse and load ready-to-use AI/ML datasets with fast manipulation. Use when searching datasets, loading training data, transforming formats.
MIT-0 · Free to use, modify, and redistribute. No attribution required.
by BytesAgain2 (@ckchzh)
Security Scan
OpenClaw verdict: Benign (high confidence)

Purpose & Capability
Name/description (dataset ingestion, transforms, query, export) align with what is provided: a bash CLI that logs operations into per-command .log files under ~/.local/share/datasets. There are no extra binaries, cloud credentials, or unrelated capabilities requested.
Instruction Scope
SKILL.md instructs the agent to run local CLI commands and explains local storage and exports. The included script only reads/writes files inside the declared data directory and uses common Unix utilities; it does not read unrelated system files, access network endpoints, or attempt to exfiltrate environment variables.
Install Mechanism
There is no install spec (instruction-only skill) and the provided code is a single bash script. Nothing is downloaded at install time and no archives or remote installers are used.
Credentials
The skill declares no required env vars, credentials, or config paths. The script uses HOME to determine a local data directory (expected) and does not request or use secrets or external service keys.
Persistence & Privilege
The skill is not always-enabled and is user-invocable. It does not modify other skills or global agent settings. Its persistence is limited to creating and updating files under the user's ~/.local/share/datasets directory.
Assessment
This tool is local-only and appears safe to use, but review the script before installing. It creates and appends logs under ~/.local/share/datasets (history.log and per-command .log files), so avoid logging sensitive secrets into it. The JSON export code does not escape values and may produce invalid JSON if entries contain quotes or newlines; treat exports as potentially containing raw user data. Also confirm you have the full, untruncated script, as the provided snippet appeared cut off mid-function in the listing. If you plan to allow autonomous invocation, be aware the script can write files to your home directory whenever invoked.
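The escaping concern above can be mitigated with a small filter before emitting JSON. This is a minimal sketch, not the skill's actual export code; the `json_escape` helper name is an assumption, and being sed-based it does not handle embedded newlines:

```shell
# Minimal JSON string escaping with sed (a sketch; does not handle newlines).
json_escape() {
  # Order matters: escape backslashes before quotes.
  printf '%s' "$1" | sed -e 's/\\/\\\\/g' -e 's/"/\\"/g'
}

val='entry with "quotes" and a \ backslash'
printf '{"value":"%s"}\n' "$(json_escape "$val")"
```

Without such a filter, a single embedded quote in an entry is enough to break the exported JSON.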
Current version: v2.0.0
SKILL.md
Datasets
A data processing toolkit for ingesting, transforming, querying, and managing dataset entries from the command line. All operations are logged with timestamps and stored locally.
Commands
Data Operations
Each data command works in two modes: run without arguments to view recent entries, or pass input to record a new entry.
| Command | Description |
|---|---|
| `datasets ingest <input>` | Ingest data: record a new ingest entry or view recent ones |
| `datasets transform <input>` | Transform data: record a transformation or view recent ones |
| `datasets query <input>` | Query data: record a query or view recent ones |
| `datasets filter <input>` | Filter data: record a filter operation or view recent ones |
| `datasets aggregate <input>` | Aggregate data: record an aggregation or view recent ones |
| `datasets visualize <input>` | Visualize data: record a visualization or view recent ones |
| `datasets export <input>` | Export data: record an export entry or view recent ones |
| `datasets sample <input>` | Sample data: record a sample or view recent ones |
| `datasets schema <input>` | Schema management: record a schema entry or view recent ones |
| `datasets validate <input>` | Validate data: record a validation or view recent ones |
| `datasets pipeline <input>` | Pipeline management: record a pipeline step or view recent ones |
| `datasets profile <input>` | Profile data: record a profile or view recent ones |
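The two-mode behavior described above can be sketched in bash as follows. This is an illustration of the pattern under stated assumptions, not the skill's actual script; the `record_or_view` function name is hypothetical:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical sketch of a two-mode data command:
# with input -> append a timestamped entry; without -> show recent entries.
DATA_DIR="${DATA_DIR:-$HOME/.local/share/datasets}"

record_or_view() {
  local category="$1"; shift
  local log="$DATA_DIR/$category.log"
  mkdir -p "$DATA_DIR"
  if [ "$#" -gt 0 ]; then
    # Record mode: store as a pipe-delimited timestamp|value pair
    printf '%s|%s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$*" >> "$log"
  else
    # View mode: show the last 10 entries, if any
    [ -f "$log" ] && tail -n 10 "$log" || echo "No entries yet for $category."
  fi
}

record_or_view ingest "loaded training_data.csv 10000 rows"  # record mode
record_or_view ingest                                        # view mode
```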
Utility Commands
| Command | Description |
|---|---|
| `datasets stats` | Show summary statistics: entry counts per category, total entries, disk usage |
| `datasets export <fmt>` | Export all data to a file (formats: json, csv, txt) |
| `datasets search <term>` | Search all log files for a term (case-insensitive) |
| `datasets recent` | Show last 20 entries from activity history |
| `datasets status` | Health check: version, data directory, entry count, disk usage, last activity |
| `datasets help` | Show available commands |
| `datasets version` | Show version (v2.0.0) |
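A case-insensitive search across per-command log files, like `datasets search <term>`, can be approximated with a recursive `grep`. A sketch under assumed data, not the skill's actual implementation:

```shell
# Sketch of how `datasets search <term>` might work:
# case-insensitive grep across all .log files in the data directory.
dir=$(mktemp -d)
printf '%s\n' '2026-01-01 10:00:00|loaded Training set' > "$dir/ingest.log"
printf '%s\n' '2026-01-01 11:00:00|ran validation pass' > "$dir/validate.log"

# -r: recurse; -i: ignore case, so "training" matches "Training"
grep -ri 'training' "$dir"
```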
Data Storage
All data is stored locally at `~/.local/share/datasets/`:

- Each data command writes to its own log file (e.g., `ingest.log`, `transform.log`)
- Entries are stored as `timestamp|value` pairs (pipe-delimited)
- All actions are tracked in `history.log` with timestamps
- Export generates files in the data directory (`export.json`, `export.csv`, or `export.txt`)
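The pipe-delimited format is straightforward to parse with standard tools. A small sketch with made-up entries (uses `cut`, which is not in the skill's listed utilities but is standard on Unix systems):

```shell
# Entries are stored one per line as timestamp|value.
log=$(mktemp)
printf '%s\n' \
  '2026-01-01 12:00:00|loaded train.csv 10000 rows' \
  '2026-01-02 09:30:00|loaded test.csv 2000 rows' > "$log"

# Values only: everything after the first pipe
cut -d'|' -f2- "$log"

# Entry count, as `datasets stats` might report
wc -l < "$log"
```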
Requirements
- Bash (with `set -euo pipefail`)
- Standard Unix utilities: `date`, `wc`, `du`, `grep`, `tail`, `cat`, `sed`
- No external dependencies or API keys required
When to Use
- To log and track data processing operations (ingest, transform, query, etc.)
- To maintain a searchable history of data pipeline activities
- To export accumulated records in JSON, CSV, or plain text format
- As part of larger automation or data-pipeline workflows
- When you need a lightweight, local-only dataset operation tracker
Examples
```bash
# Record a new ingest entry
datasets ingest "loaded training_data.csv 10000 rows"

# View recent transform entries
datasets transform

# Record a query
datasets query "filter by date > 2026-01-01"

# Search across all logs
datasets search "training"

# Export everything as JSON
datasets export json

# Check overall statistics
datasets stats

# View recent activity
datasets recent

# Health check
datasets status
```
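For the export formats, a CSV conversion of the pipe-delimited log can be sketched with `sed`. This is an illustration with made-up entries, not the skill's actual export code; note that values containing commas would need quoting for strict CSV:

```shell
# Sketch of how `datasets export csv` might convert timestamp|value entries.
log=$(mktemp)
printf '%s\n' \
  '2026-01-01 12:00:00|loaded train.csv' \
  '2026-01-02 09:00:00|ran profiling' > "$log"

{
  echo 'timestamp,value'
  sed 's/|/,/' "$log"   # replace only the first pipe on each line
} > "$log.csv"

cat "$log.csv"
```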
Powered by BytesAgain | bytesagain.com | hello@bytesagain.com
💬 Feedback & Feature Requests: https://bytesagain.com/feedback