Install

```shell
openclaw skills install bigdata
```

A comprehensive data processing toolkit for ingesting, transforming, querying, filtering, aggregating, and managing data workflows, all from the command line with local timestamped log storage. Split large files, run parallel processing, and stream batch analysis. Use it when sampling datasets, aggregating logs, or transforming bulk data.
| Command | Description |
|---|---|
| `bigdata ingest <input>` | Ingest raw data into the system. Without args, shows recent ingest entries |
| `bigdata transform <input>` | Record a data transformation step. Without args, shows recent transforms |
| `bigdata query <input>` | Log and track data queries. Without args, shows recent queries |
| `bigdata filter <input>` | Apply and record data filters. Without args, shows recent filters |
| `bigdata aggregate <input>` | Record aggregation operations. Without args, shows recent aggregations |
| `bigdata visualize <input>` | Log visualization tasks. Without args, shows recent visualizations |
| `bigdata export <input>` | Log export operations. Without args, shows recent exports |
| `bigdata sample <input>` | Record data sampling operations. Without args, shows recent samples |
| `bigdata schema <input>` | Track schema definitions and changes. Without args, shows recent schemas |
| `bigdata validate <input>` | Log data validation checks. Without args, shows recent validations |
| `bigdata pipeline <input>` | Record pipeline configurations. Without args, shows recent pipelines |
| `bigdata profile <input>` | Log data profiling operations. Without args, shows recent profiles |
| `bigdata stats` | Show summary statistics across all entry types |
| `bigdata search <term>` | Search across all log entries for a keyword |
| `bigdata recent` | Show the 20 most recent activity entries from the history log |
| `bigdata status` | Health check: version, data dir, total entries, disk usage, last activity |
| `bigdata help` | Show all available commands |
| `bigdata version` | Print version (v2.0.0) |
Each data command (ingest, transform, query, etc.) works the same way: with an argument, it appends a timestamped entry to its own `.log` file and records it in the activity history; without arguments, it lists recent entries.

All data is stored locally in plain-text log files:
```
~/.local/share/bigdata/
├── ingest.log      # Ingested data entries
├── transform.log   # Transformation records
├── query.log       # Query log
├── filter.log      # Filter operations
├── aggregate.log   # Aggregation records
├── visualize.log   # Visualization tasks
├── export.log      # Export operations
├── sample.log      # Sampling records
├── schema.log      # Schema definitions
├── validate.log    # Validation checks
├── pipeline.log    # Pipeline configurations
├── profile.log     # Profiling results
└── history.log     # Unified activity log with timestamps
```
Each entry is stored as `YYYY-MM-DD HH:MM|<value>` for easy parsing and export.
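Because each line is plain pipe-delimited text, an entry can be split without any external tools, using shell parameter expansion alone. A minimal sketch (the sample entry below is illustrative, not real log output):

```shell
# A sample line in the "YYYY-MM-DD HH:MM|<value>" format
entry='2024-03-15 09:30|customer_orders_2024.csv ingested'

timestamp=${entry%%|*}   # text before the first pipe: "2024-03-15 09:30"
value=${entry#*|}        # text after the first pipe

echo "$timestamp"
echo "$value"
```

The same split works in a loop over any of the log files, e.g. `while IFS='|' read -r ts val; do ...; done < ~/.local/share/bigdata/ingest.log`.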
Implemented in plain Bash (`set -euo pipefail`) using only standard tools: `date`, `wc`, `du`, `grep`, `head`, `tail`, `cat`.

```shell
# Ingest raw data
bigdata ingest "customer_orders_2024.csv — 1.2M rows loaded"

# Transform it
bigdata transform "normalize dates to ISO-8601, trim whitespace, deduplicate"

# Validate the output
bigdata validate "all required fields present, no nulls in customer_id"

# Record the schema
bigdata schema "orders: id(int), customer_id(int), amount(decimal), date(date)"

# Export when ready
bigdata export "final dataset pushed to analytics warehouse"

# Search across all logs for a keyword
bigdata search "customer"

# Check overall statistics
bigdata stats

# View recent activity across all commands
bigdata recent

# Health check
bigdata status

# Define a pipeline
bigdata pipeline "daily-etl: ingest → clean → validate → load — runs at 02:00 UTC"

# Profile a dataset
bigdata profile "users table: 500K rows, 12 columns, 0.3% nulls in email field"

# Sample data for testing
bigdata sample "random 10% sample from transactions for QA testing"

# Record an aggregation
bigdata aggregate "monthly revenue by region — Q1 totals computed"

# Log a filter operation
bigdata filter "removed records older than 2020-01-01, kept 850K of 1.2M rows"

# Track a query
bigdata query "SELECT region, SUM(revenue) FROM orders GROUP BY region"

# Log a visualization
bigdata visualize "bar chart: monthly revenue trend, exported as PNG"
```
All commands print confirmation to stdout. Data is persisted in `~/.local/share/bigdata/`. Use `bigdata stats` for a summary or `bigdata search <term>` to find specific entries across all logs.
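Since each log holds one entry per line, a per-log summary along the lines of what `bigdata stats` reports can also be produced directly with standard tools. A sketch, assuming the data directory has already been populated:

```shell
DATA_DIR="$HOME/.local/share/bigdata"

# Count entries in each log, one line per log file
for log in "$DATA_DIR"/*.log; do
  [ -e "$log" ] || continue   # skip when the directory is empty
  printf '%-12s %s entries\n' "$(basename "$log" .log)" "$(wc -l < "$log")"
done
```

This stays entirely within the same standard tools the skill itself relies on, so it works anywhere the logs do.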
Powered by BytesAgain | bytesagain.com | hello@bytesagain.com