Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Supermemory Free

v1.0.0

Cloud knowledge backup and retrieval using the Supermemory.ai free tier. Store high-value insights to the cloud and search them back when local memory is insufficient.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw to install broedkrummen/supermemory-free.

Prompt preview (Install & Setup):
Install the skill "Supermemory Free" (broedkrummen/supermemory-free) from ClawHub.
Skill page: https://clawhub.ai/broedkrummen/supermemory-free
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Canonical install target

openclaw skills install broedkrummen/supermemory-free

ClawHub CLI


npx clawhub@latest install supermemory-free
Security Scan

VirusTotal: Suspicious
OpenClaw: Suspicious (medium confidence)

Purpose & Capability
The code and SKILL.md clearly require a SUPERMEMORY_OPENCLAW_API_KEY and perform POSTs to https://api.supermemory.ai (store/search), which aligns with the described purpose. However, the registry summary at the top of this report incorrectly lists "Required env vars: none" while _meta.json and SKILL.md/CLI code declare SUPERMEMORY_OPENCLAW_API_KEY as required; this metadata mismatch is an inconsistency that could mislead users about required secrets.
Instruction Scope
The auto_capture.py scans local session memory files in WORKSPACE/memory (e.g., memory/YYYY-MM-DD.md), extracts candidate 'high-value' lines, and uploads them. The SKILL.md and code instruct installing a cron job that runs daily and sources .env. While there are skip patterns to avoid storing obvious passwords/tokens, the heuristics are fallible (and the code will happily upload paths, config lines, error traces, API endpoints, etc.). This means sensitive information present in memory logs could be uploaded unintentionally.
Install Mechanism
There is no remote install/download; the skill is instruction-plus-scripts. The provided install_cron.sh modifies the user's crontab to run the auto-capture daily and creates log files — expected behavior for an auto-capture feature. No external archives or unknown URLs are fetched during install.
Credentials
The only secret the code needs is SUPERMEMORY_OPENCLAW_API_KEY, which is proportionate to the stated function. However, the code searches for .env in multiple directories (workspace paths, relative skill paths, and home) which increases the chance it will pick up keys from unexpected locations. Additionally, the registry metadata shown to users omitted the required env var, which is a misleading inconsistency.
Persistence & Privilege
The skill does not request always:true and is not force-installed, but the optional install_cron.sh will create a persistent cron job (daily at 02:00 UTC) that runs autonomously and uploads content when present. If the cron is installed, the skill will have persistent periodic network access to upload extracted lines.
What to consider before installing
This skill implements the advertised cloud backup/search behavior, but take care before installing or enabling auto-capture:

1. Confirm and set SUPERMEMORY_OPENCLAW_API_KEY in the intended .env (the registry view you saw omitted it; that's an inconsistency).
2. Review your memory/ logs (memory/YYYY-MM-DD.md) for any sensitive secrets, credentials, full tokens, or paths you do not want uploaded; the script's heuristics may miss them.
3. Use the auto-capture dry run (--dry-run) first to see what would be uploaded.
4. If you don't want persistent uploads, do not run install_cron.sh (or inspect/modify the cron command to remove 'source .env' or restrict scope).
5. If you install the cron, monitor the created log file and the .capture_state.json dedup file; consider limiting file permissions.
6. Prefer manual store/search usage if you cannot guarantee that session memory is free of sensitive data.

If you want a cleaner metadata view, ask the publisher to correct the registry metadata to list the required env var(s).

Like a lobster shell, security has layers — review code before you run it.

Latest: vk97a9w9xma89sh89rvrcz2nfd9817rb1
753 downloads · 0 stars · 1 version
Updated 3h ago · v1.0.0 · MIT-0

Supermemory Free — Cloud Knowledge Backup

Backs up important knowledge and insights to Supermemory.ai's cloud using the free tier API.
Uses only /v3/documents (store) and /v3/search (retrieve) — no Pro-only endpoints.

Prerequisites

Set in .env

SUPERMEMORY_OPENCLAW_API_KEY="sm_..."

Tools

supermemory_cloud_store

Store a knowledge string to the cloud.

python3 skills/supermemory-free/store.py "Your knowledge string here"

# With optional container tag (namespace/filter)
python3 skills/supermemory-free/store.py "knowledge string" --tag openclaw

# With metadata
python3 skills/supermemory-free/store.py "knowledge string" --tag fixes --source "session"

# Output raw JSON
python3 skills/supermemory-free/store.py "knowledge string" --json

When to use:

  • User asks to "remember" something permanently
  • Important configuration/setup knowledge
  • Resolved problems / solutions discovered
  • Key facts you want cross-session persistence for
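Under the hood, a store call amounts to a POST to /v3/documents with a bearer token. A minimal sketch of the request assembly, assuming payload field names ("content", "containerTags", "metadata") that are not verified against store.py:

```python
import json

API_BASE = "https://api.supermemory.ai"  # base URL from the API Info section

def build_store_request(content, tag="openclaw", source=None):
    """Assemble the POST /v3/documents URL and JSON body.

    Field names are illustrative assumptions about the API schema.
    """
    payload = {"content": content, "containerTags": [tag]}
    if source is not None:
        payload["metadata"] = {"source": source}
    return API_BASE + "/v3/documents", json.dumps(payload)
```

The --tag and --source CLI flags map onto the container tag and metadata fields shown above.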

supermemory_cloud_search

Search the cloud memory for relevant knowledge.

python3 skills/supermemory-free/search.py "your query"

# With container tag filter
python3 skills/supermemory-free/search.py "your query" --tag openclaw

# More results
python3 skills/supermemory-free/search.py "your query" --limit 10

# Higher precision (less noise)
python3 skills/supermemory-free/search.py "your query" --threshold 0.7

# Search across ALL tags
python3 skills/supermemory-free/search.py "your query" --no-tag

When to use:

  • Local memory (MEMORY.md, daily logs) doesn't have the answer
  • User references something from "a long time ago"
  • Cross-session knowledge lookup
  • "Do you remember when..." queries
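The --threshold and --limit flags trim the raw search response. A small sketch of that post-filtering, assuming each hit carries a "score" field (the actual response shape is not verified here):

```python
def filter_results(results, threshold=0.5, limit=5):
    """Keep hits scoring at or above threshold, best first, capped at limit.

    The "score" field per result is an assumption about the response shape.
    """
    kept = [r for r in results if r.get("score", 0.0) >= threshold]
    kept.sort(key=lambda r: r["score"], reverse=True)
    return kept[:limit]
```

Raising the threshold (e.g. 0.7) trades recall for precision, which is why the "less noise" example above uses it.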

Auto-Capture (Cron)

Scans recent session memory logs and automatically pushes high-value insights to Supermemory cloud.

# Run manually
python3 skills/supermemory-free/auto_capture.py

# Dry run (show what would be captured, no upload)
python3 skills/supermemory-free/auto_capture.py --dry-run

# Scan last N days (default: 3)
python3 skills/supermemory-free/auto_capture.py --days 7

# Force re-upload even if already seen
python3 skills/supermemory-free/auto_capture.py --force

# Verbose mode
python3 skills/supermemory-free/auto_capture.py --verbose

Install cron job (runs daily at 2:00 AM UTC):

bash skills/supermemory-free/install_cron.sh

Remove cron job:

bash skills/supermemory-free/install_cron.sh --remove

Check cron status:

bash skills/supermemory-free/install_cron.sh --status
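For reference, the crontab entry install_cron.sh creates looks roughly like the line below (the exact paths and log file name are illustrative; inspect the script before running it):

```shell
# minute hour day month weekday: 02:00 UTC daily
0 2 * * * cd /path/to/workspace && . ./.env && python3 skills/supermemory-free/auto_capture.py >> supermemory_capture.log 2>&1
```

As the security scan notes, this entry sources .env and uploads autonomously every day; remove or edit it if you do not want persistent network access.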

What Gets Auto-Captured

The auto-capture script identifies "high-value" insights from memory logs using these heuristics:

| Pattern | Label | Example |
|---|---|---|
| Resolved errors / fixes | fix | Fixed: SSL cert error by running... |
| Error context | error | Exception: Connection refused on port 5432 |
| Configuration paths | config | /etc/nginx/sites-available/default |
| API/endpoint info | api | Endpoint: POST /v3/documents for storage |
| User preferences | preference | User prefers Python over Node for scripts |
| Decisions made | decision | Decided to use PostgreSQL because... |
| Learned facts | insight | Learned that cron syntax for... |
| Installs / setup | setup | Installed nginx, configured with... |
| Bullet-point blocks | bullet | - Key finding: X works better than Y |
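Heuristics like these typically boil down to an ordered list of regex patterns, first match wins. A sketch under that assumption (auto_capture.py's real patterns may differ, and as the security scan warns, such heuristics are fallible):

```python
import re

# Illustrative label patterns; not the skill's actual rule set.
PATTERNS = [
    ("fix", re.compile(r"^(Fixed|Resolved)\b", re.I)),
    ("error", re.compile(r"\b(Exception|Error|refused)\b")),
    ("config", re.compile(r"^/etc/|\.conf\b")),
    ("decision", re.compile(r"^Decided\b", re.I)),
    ("bullet", re.compile(r"^\s*[-*] ")),
]

def classify(line: str):
    """Return the first matching label for a memory-log line, or None."""
    for label, pattern in PATTERNS:
        if pattern.search(line):
            return label
    return None
```

Because matching is purely lexical, a line containing a secret that merely looks like a "fix" or "config" note will still be selected for upload; this is why the dry run is worth doing first.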

Deduplication: Already-uploaded items are tracked in .capture_state.json — re-running is safe.


Container Tags

Use --tag to namespace your memories:

| Tag | Purpose |
|---|---|
| openclaw | General OpenClaw session knowledge (default) |
| fixes | Bug fixes and solutions |
| config | Configuration and setup |
| user-prefs | User preferences |
| projects | Project-specific knowledge |

Files

| File | Purpose |
|---|---|
| store.py | CLI tool: upload knowledge to cloud |
| search.py | CLI tool: search cloud knowledge |
| auto_capture.py | Cron script: auto-analyze memory logs |
| install_cron.sh | Install/remove/status of cron job |
| .capture_state.json | Dedup state (auto-generated, gitignored) |
| SKILL.md | This file |
| _meta.json | Skill metadata |

API Info

  • Base URL: https://api.supermemory.ai
  • Store endpoint: POST /v3/documents
  • Search endpoint: POST /v3/search
  • Auth: Bearer token from SUPERMEMORY_OPENCLAW_API_KEY
  • Free tier limits: Check https://console.supermemory.ai for current quotas
  • Note: Cloudflare-compatible headers included — avoids 1010 access denial errors

Troubleshooting

HTTP 403 / 1010 Access Denied:
The scripts include proper User-Agent, Origin, and Referer headers to satisfy Cloudflare. If the error recurs, verify the API key is valid at https://console.supermemory.ai.
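A sketch of the header set this describes; the exact values the scripts send are assumptions, but the shape is the same:

```python
def cloudflare_headers(api_key):
    """Browser-like headers of the kind the note above describes.

    The User-Agent / Origin / Referer values are illustrative, not
    copied from the skill's scripts.
    """
    return {
        "Authorization": "Bearer " + api_key,
        "Content-Type": "application/json",
        # Browser-like headers help avoid Cloudflare 1010 access denials
        "User-Agent": "Mozilla/5.0 (compatible; supermemory-free-skill)",
        "Origin": "https://console.supermemory.ai",
        "Referer": "https://console.supermemory.ai/",
    }
```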

No memory files found:
Auto-capture looks in memory/YYYY-MM-DD.md. Ensure your memory skill is writing daily logs there.

Re-upload everything:
Delete .capture_state.json or use --force to ignore the dedup state.
