Cloud Backup

v1.1.5

Back up and restore OpenClaw state. Creates local archives and uploads to S3-compatible cloud storage (AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces).

4 stars · 1.6k downloads · 13 current · 13 all-time
by Evgeni Obuchowski (@obuchowski)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for obuchowski/cloud-backup.

Prompt Preview: Install & Setup
Install the skill "Cloud Backup" (obuchowski/cloud-backup) from ClawHub.
Skill page: https://clawhub.ai/obuchowski/cloud-backup
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required binaries: bash, tar, jq, aws
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Canonical install target

openclaw skills install obuchowski/cloud-backup

ClawHub CLI

Package manager switcher

npx clawhub@latest install cloud-backup
Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
Name/description match the implementation. Required binaries (bash, tar, jq, aws) and the included script are appropriate for creating compressed archives, optionally encrypting them with GPG, and uploading via AWS-compatible CLI. The skill only operates on the OpenClaw state directory (OPENCLAW_STATE / ~/.openclaw) and its own config entries.
Instruction Scope
SKILL.md limits runtime actions to: run the included script, prompt the user about encryption, collect cloud provider credentials, write config entries for this skill via gateway config.patch, and optionally create a scheduled cron job. The instructions do not ask to read unrelated system files or exfiltrate data to unexpected endpoints beyond the configured S3-compatible providers.
Install Mechanism
There is no install spec (instruction-only style), and the only code shipped is the bash script. The script is executed directly from the skill bundle (no external downloads or arbitrary installers), which is the lower-risk pattern for this kind of utility.
Credentials
The skill needs access to S3-compatible credentials and an optional GPG passphrase to function. These are requested via OpenClaw config entries (skills.entries.cloud-backup.env.*) or via named AWS profile. This is expected, but worth noting: storing ACCESS_KEY_ID / SECRET_ACCESS_KEY and GPG_PASSPHRASE in the OpenClaw config means secrets may be persisted in plain text unless the environment/host protects that file. The SKILL.md explicitly warns backups contain secrets and prompts the user to enable encryption—this is appropriate but the storage of the passphrase in config is a sensitive choice the user should consider.
Persistence & Privilege
The skill may create a scheduled cron job that triggers an agentTurn payload (i.e., autonomous invocations). This is coherent with the backup use-case (scheduling regular backups) and the SKILL.md says to ask the user before scheduling, but users should be aware that the cron payload will cause the agent to run the skill autonomously at the specified times.
Assessment
What to consider before installing:

  • Functionality: This skill tars your OpenClaw state (~/.openclaw), so archives will include configuration, credentials, and any secrets stored there. That is the intended behavior for a backup tool.
  • Credentials: The skill expects S3-compatible credentials (ACCESS_KEY_ID / SECRET_ACCESS_KEY) or a named AWS profile. The recommended flow is to create a bucket-scoped, least-privilege key pair. Prefer a named profile or short-lived credentials where possible instead of storing long-lived secrets in config.
  • GPG passphrase: If you enable encryption, the SKILL.md suggests storing the GPG passphrase in the skill's config (env.GPG_PASSPHRASE) for non-interactive restores and cron runs. Storing passphrases in OpenClaw config is convenient but means the passphrase itself must be protected; consider using a secret manager or requiring interactive entry if you need stronger protection.
  • Scheduling/autonomy: The skill can create a cron job that triggers the agent to run backups automatically. Allow this only if you trust the agent and the scheduling action; review the schedule and cron payload before accepting.
  • Inspect the script: The full bash script is included in the package; review it if you want to verify details (files excluded, encryption flow, where files are written, exact S3 commands). It uses the aws CLI (aws s3 cp/ls/rm), gpg for encryption, and standard tar/sha utilities.
  • Protect the OpenClaw config: Ensure ~/.openclaw/openclaw.json is filesystem-permission restricted (references/security.md recommends mode 600). Rotate keys if they are ever exposed.

If those trade-offs are acceptable (i.e., you want automated backups of OpenClaw state and are prepared to manage credentials and encryption appropriately), the skill is coherent and reasonable to use.
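The 600-permission recommendation in the last point is easy to verify with a short shell check. This is a generic sketch, not part of the skill; the `check_mode` helper is illustrative, and the demo runs on a throwaway file so it is safe to execute anywhere. Point it at ~/.openclaw/openclaw.json for real use.

```shell
# Sketch: warn when a config file is group/world-readable (600 recommended).
# check_mode is an illustrative helper, not part of the skill.
check_mode() {
  # GNU stat first, BSD stat as fallback
  mode="$(stat -c '%a' "$1" 2>/dev/null || stat -f '%Lp' "$1")"
  if [ "$mode" = "600" ]; then
    echo "OK: $1 is $mode"
  else
    echo "WARN: $1 is $mode; consider: chmod 600 $1"
  fi
}

# Demo on a throwaway file; use ~/.openclaw/openclaw.json in practice.
DEMO="$(mktemp)"
chmod 644 "$DEMO"
check_mode "$DEMO"   # WARN: ... is 644 ...
chmod 600 "$DEMO"
check_mode "$DEMO"   # OK: ... is 600
```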

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

☁️ Clawdis
Bins: bash, tar, jq, aws
Latest: vk971k6m212m86pv609490p7sv581ez07
1.6k downloads · 4 stars · 7 versions
Updated 2 months ago
v1.1.5
MIT-0

OpenClaw Cloud Backup

Backs up OpenClaw state to a local archive and uploads it to cloud storage.

Follow the steps below. Inform the user about implicit defaults applied after execution.

Backup procedure (follow every step)

Step 1: Run the backup

bash "{baseDir}/scripts/cloud-backup.sh" backup full

Default mode is full. Use workspace, skills, or settings only when the user explicitly asks for a narrower backup scope.
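The script's internals are not shown on this page, but a full-mode backup plausibly reduces to archiving the state directory plus a checksum. A minimal sketch under that assumption; the real script's flags, naming, and exclusions may differ, and the demo points OPENCLAW_STATE at a throwaway directory so it is safe to run:

```shell
# Hypothetical sketch of a "full" backup: tar the state dir, then checksum.
# Demo state dir; the skill normally uses OPENCLAW_STATE / ~/.openclaw.
export OPENCLAW_STATE="$(mktemp -d)"
echo '{}' > "$OPENCLAW_STATE/openclaw.json"

STAMP="$(date +%Y%m%d-%H%M%S)"
OUT="$(mktemp -d)/openclaw-backup-$STAMP.tar.gz"

tar -czf "$OUT" -C "$(dirname "$OPENCLAW_STATE")" "$(basename "$OPENCLAW_STATE")"
sha256sum "$OUT" > "$OUT.sha256"

tar -tzf "$OUT"   # lists the archived state files
```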

Step 2: Check backup output for encryption warning

Look at the script output from Step 1. If it contains:

WARN: Encryption is disabled — backup archive will be stored in plaintext.

ask the user:

"Your backups are not encrypted. Archives contain config, credentials, and API keys in cleartext. Want to set a passphrase? (AES-256, just the passphrase needed to restore — no key files.)"

  • If user provides a passphrase → write both via gateway config.patch and tell the user what was saved:
    • skills.entries.cloud-backup.config.encrypt = true
    • skills.entries.cloud-backup.env.GPG_PASSPHRASE = "<passphrase>"
    Then re-run the backup so the archive is encrypted.
  • If user says no / skip → continue to Step 3.
  • If the warning is not present (encryption already enabled) → continue to Step 3.

Always execute this step and report the result. Backups may contain secrets — prompt every time encryption is off.
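The encryption described above (AES-256, passphrase only, no key files) maps onto gpg's symmetric mode. A hedged sketch of roughly what the script might do; the exact gpg invocation inside cloud-backup.sh may differ, and the passphrase and file here are throwaway demo values:

```shell
# Symmetric AES-256 encryption: only the passphrase is needed to restore.
# Demo values; the real script's gpg flags may differ.
PASSPHRASE="demo-passphrase"
ARCHIVE="$(mktemp)"
echo "fake backup data" > "$ARCHIVE"

gpg --batch --yes --pinentry-mode loopback \
    --symmetric --cipher-algo AES256 \
    --passphrase "$PASSPHRASE" \
    --output "$ARCHIVE.gpg" "$ARCHIVE"

# Restore needs only the same passphrase, no key files.
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "$PASSPHRASE" \
    --decrypt --output "$ARCHIVE.restored" "$ARCHIVE.gpg"
```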

Step 3: Check script output for warnings

Look at the script output. If it contains WARN: Cloud storage is not configured, go to Step 4. Otherwise go to Step 5.

Step 4: Cloud not configured — ask user to set it up

Tell the user the backup was saved locally, then ask:

"Cloud storage isn't configured yet — backups are local only. Want to set up cloud upload? I support AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or any other S3-compatible service."

  • If user says yes → go to Cloud setup section below, then re-run the backup.
  • If user says no / local-only → set config.upload=false via gateway config.patch and confirm the change to the user. Done.

Always execute this step — offer cloud setup when it's missing and report what was configured.

Step 5: Report result and ensure scheduling

Report the backup paths from the script output to the user.

Then check if a daily cron job already exists for cloud-backup (cron action=list). If no cron job exists, create one with the default schedule and tell the user:

{
  "schedule": { "kind": "cron", "expr": "0 2 * * *" },
  "payload": { "kind": "agentTurn", "message": "Run cloud-backup: backup full" },
  "sessionTarget": "isolated"
}

Default: daily at 2 AM. Use a different schedule only if the user explicitly asked for one.
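If the user asked for a different cadence, only the cron expression changes; the payload shape stays the same. As a sketch, a weekly run (Sundays at 2 AM) would look like:

```json
{
  "schedule": { "kind": "cron", "expr": "0 2 * * 0" },
  "payload": { "kind": "agentTurn", "message": "Run cloud-backup: backup full" },
  "sessionTarget": "isolated"
}
```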

Tell the user: "Scheduled daily backup at 2 AM." (or whatever time was set).

Execute this step by default unless the user asked not to. Report the result: cron job created, or failed to be created.


Cloud setup

When the user agrees to configure cloud storage:

  1. Ask which provider: AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or other.
  2. Read the matching provider guide from references/providers/ — it has exact config keys, endpoint format, and credential steps.
  3. Collect and write config via gateway config.patch — bucket, credentials, endpoint (if non-AWS).
  4. Run status to verify connectivity, then re-run backup.
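As a concrete sketch of step 3, a config.patch would populate the keys documented under the Config reference below, nested as implied by the paths used earlier (skills.entries.cloud-backup.config.* / .env.*). Bucket name, endpoint, and credential values here are placeholders, with a Cloudflare R2-style endpoint assumed for illustration:

```json
{
  "skills": {
    "entries": {
      "cloud-backup": {
        "config": {
          "bucket": "my-openclaw-backups",
          "endpoint": "https://<account-id>.r2.cloudflarestorage.com",
          "upload": true
        },
        "env": {
          "ACCESS_KEY_ID": "<key id>",
          "SECRET_ACCESS_KEY": "<secret>"
        }
      }
    }
  }
}
```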

Commands

bash "{baseDir}/scripts/cloud-backup.sh" <command>
  • backup [full|workspace|skills|settings]: Create archive + upload if configured. Default: full
  • list: Show local + remote backups
  • restore <name> [--dry-run] [--yes]: Restore from local or cloud. Always --dry-run first
  • cleanup: Prune old archives (local: capped at 7; cloud: count + age)
  • status: Show current config and dependency check
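The local half of the cleanup policy (cap at 7 archives, delete the oldest) can be sketched with coreutils. The directory layout and file naming are assumptions, and the demo uses a throwaway directory with fake timestamps:

```shell
# Sketch of local retention: keep the 7 newest archives, drop the rest.
# Throwaway directory and fabricated mtimes for the demo.
DIR="$(mktemp -d)"
for i in $(seq 1 10); do
  touch -d "@$((1700000000 + i * 60))" "$DIR/backup-$i.tar.gz"
done

# ls -1t lists newest first; tail -n +8 selects everything past the 7th.
ls -1t "$DIR"/*.tar.gz | tail -n +8 | xargs -r rm -f

ls "$DIR" | wc -l   # prints 7
```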

Config reference

All in skills.entries.cloud-backup in OpenClaw config. Don't write defaults — the script handles them.

config.*

Key              Default     Description
bucket           (none)      Storage bucket name (required for cloud)
region           us-east-1   Region hint
endpoint         (none)      S3-compatible endpoint (required for non-AWS)
profile          (none)      Named AWS CLI profile (alternative to keys)
upload           true        Upload to cloud after backup
encrypt          false       GPG-encrypt archives
retentionCount   10          Cloud: keep N backups. Local: capped at 7
retentionDays    30          Cloud only: delete archives older than N days

env.*

Key                 Description
ACCESS_KEY_ID       S3-compatible access key
SECRET_ACCESS_KEY   S3-compatible secret key
SESSION_TOKEN       Optional temporary token
GPG_PASSPHRASE      For automated encryption/decryption

Provider guides

Read the relevant one only during setup:

  • references/providers/aws-s3.md
  • references/providers/cloudflare-r2.md
  • references/providers/backblaze-b2.md
  • references/providers/minio.md
  • references/providers/digitalocean-spaces.md
  • references/providers/other.md — any S3-compatible service

Security

See references/security.md for credential handling and troubleshooting.
