Last30Days Community Intelligence for OpenClaw
v1.0.0-openclaw.1

OpenClaw adaptation of @mvanhorn's last30days skill. Research any topic from the last 30 days across Reddit, X, YouTube, TikTok, Instagram, Hacker News, Polymarket, and web search.
Security Scan

Scanner: OpenClaw
Verdict: Suspicious (medium confidence)

Purpose & Capability
The code and docs match the name and description: connectors for Reddit, X, YouTube, TikTok, Instagram, Hacker News, Polymarket, and web search are present and used by the Python engine. The vendored Bird client for X (twitter/x) is included, which fits the stated X support.
Instruction Scope
Runtime instructions and the shipped scripts direct the agent/operator to run Python/Node scripts that perform network scraping and persist results locally. The vendored Bird client reads browser cookies on macOS (explicitly documented). The code also creates and reads a secrets file (~/.openclaw/workspace/.secrets/last30days.env), writes database and briefing files under ~/.openclaw/workspace/data, and the upstream README mentions auto-saving to ~/Documents/Last30Days/. These I/O and cookie-access actions touch sensitive local data and create persistent storage beyond ephemeral runtime output.
Install Mechanism
No remote install spec or download URLs are present (instruction-only packaging + included source). The vendored bird-search client is included in the repository. Node 22+ is required for the vendored bird client but there is no remote fetch of arbitrary code at install time.
Credentials
The project uses multiple sensitive environment variables (SCRAPECREATORS_API_KEY, OPENAI_API_KEY, XAI_API_KEY, an AUTH_TOKEN/CT0 fallback for X, PARALLEL_API_KEY/BRAVE_API_KEY/OPENROUTER_API_KEY, etc.), which is reasonable for the declared data sources, but the skill metadata declares no required env vars or primary credential. This omission of declared required secrets is an inconsistency and an information gap the user should notice. Additionally, the documented ability to read browser cookies (and prompt for Keychain access on macOS) is sensitive and should be weighed before running.
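The code expects a .env-like secrets file. A minimal POSIX-shell sketch of how such a KEY=value file could be loaded into the environment follows; this is an illustration of the pattern, not the skill's actual loader, and the variable name in the usage note is only an example from the list above.

```shell
# Hypothetical loader sketch (not the skill's actual code): source a
# .env-style secrets file and export every variable it defines.
load_secrets() {
  set -a        # auto-export all variables assigned while sourcing
  . "$1"        # read the KEY=value file
  set +a
}
```

Usage would look like `load_secrets ~/.openclaw/workspace/.secrets/last30days.env`, after which e.g. SCRAPECREATORS_API_KEY is visible to child processes.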
Persistence & Privilege
The skill writes local SQLite DBs, briefings, logs, and a secrets file under the user's OpenClaw workspace (persistent storage). `always: false` (no forced always-on). The upstream docs mention an optional auto-save to ~/Documents/Last30Days/ (persistent user documents); that behavior is potentially surprising and should be confirmed before use. The skill does not request system-wide privileges or modify other skills in the repository.
What to consider before installing
This repository implements the advertised multi-source research engine, but before installing or running it you should:
- Review the scripts yourself (especially `scripts/lib/vendor/bird-search` and `scripts/setup_openclaw_env.sh`) to confirm you accept cookie access and file-write behavior.
- Be aware the Bird client can read browser cookies on macOS (it may prompt for Keychain access); if you don't want that, avoid the Bird path and provide service API keys instead, or run with mocking (`--mock`).
- Expect to supply sensitive API keys (ScrapeCreators, OpenAI, xAI, Brave/Parallel/OpenRouter) if you want native scraping/search; store them securely (the code expects a .env-like secrets file) and set file permissions (chmod 600).
- Note the skill writes persistent data (SQLite DB, briefs, logs) into ~/.openclaw/workspace (and upstream docs mention saving to ~/Documents/Last30Days/); if you don't want persistent archives, run with `--no-store` or inspect/modify the scripts to change save locations.
- Because the skill metadata does not enumerate its environment/credential needs, do not rely solely on registry metadata; the code itself expects many env vars. To reduce risk, run the skill in a sandbox/container or test with `--mock` to avoid network/cookie access.
If these behaviors (cookie reading, multiple API keys, persistent storage) are acceptable for your threat model and you have reviewed the code, the skill is coherent with its purpose. If not, do not install or run it until you either remove the vendored cookie path or run it in an isolated environment.

Like a lobster shell, security has layers: review code before you run it.
last30days-openclaw
Attribution: This skill is an OpenClaw adaptation of @mvanhorn's MIT-licensed project: https://github.com/mvanhorn/last30days-skill.
What is original vs adapted
Original (from @mvanhorn)
- Core Python research engine (`scripts/last30days.py` + `scripts/lib/*`)
- Multi-source data collection and ranking logic
- Watchlist, briefing, and history database architecture
- Vendored `bird-search` X client and source connectors
OpenClaw adaptation (this folder)
- OpenClaw skill packaging (`skill.json`, this `SKILL.md`)
- OpenClaw-first storage paths under `~/.openclaw/workspace`
- OpenClaw secrets file convention: `~/.openclaw/workspace/.secrets/last30days.env`
- OpenClaw cron helper: `scripts/openclaw_watchlist_run.sh`
- Setup helper for secrets: `scripts/setup_openclaw_env.sh`
Runtime paths (OpenClaw defaults)
- Secrets: `~/.openclaw/workspace/.secrets/last30days.env`
- DB: `~/.openclaw/workspace/data/last30days/research.db`
- Briefings: `~/.openclaw/workspace/data/last30days/briefs/`
- Output artifacts: `~/.openclaw/workspace/data/last30days/out/`
Setup
```shell
cd ~/.openclaw/workspace/skills/last30days-openclaw
./scripts/setup_openclaw_env.sh
python3 scripts/last30days.py --diagnose
```
macOS X-cookie support (Bird)
The vendored Bird client reads browser cookies on macOS.
- Log into x.com in Safari/Chrome/Firefox
- Verify auth:

```shell
node scripts/lib/vendor/bird-search/bird-search.mjs --whoami
```

If that fails, set `AUTH_TOKEN` + `CT0` in the secrets file.
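If cookie access is unavailable or unwanted, the fallback is plain token variables in the secrets file. A sketch of the two lines follows; the values are placeholders you would copy from an authenticated x.com browser session, and the exact cookie-extraction steps are up to you:

```shell
# In ~/.openclaw/workspace/.secrets/last30days.env (placeholder values):
AUTH_TOKEN=your_x_auth_token_cookie
CT0=your_x_ct0_cookie
```

With these set, the Bird path should not need to touch your browser's cookie store at all.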
Command routing
Use the first token to route the mode:

- `watch ...` → watchlist management
- `briefing ...` → briefing generation
- `history ...` → history/FTS queries
- anything else → one-shot research
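A hypothetical sketch of that first-token dispatch is below. The script names are the ones this document invokes in each mode; the actual router inside the skill may be implemented differently.

```shell
# Hypothetical dispatcher: map the first token of a request to the
# script that handles that mode (anything unrecognized = one-shot research).
route() {
  case "$1" in
    watch)    echo "scripts/watchlist.py" ;;
    briefing) echo "scripts/briefing.py" ;;
    history)  echo "scripts/store.py" ;;
    *)        echo "scripts/openclaw_run.py" ;;
  esac
}
```

For example, `route watch` selects the watchlist script, while a free-form topic like `route "agentic coding"` falls through to one-shot research.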
One-shot research (default mode)
Run via OpenClaw exec:
```shell
cd ~/.openclaw/workspace/skills/last30days-openclaw
python3 scripts/openclaw_run.py "TOPIC"
# equivalent engine call:
# python3 scripts/last30days.py "TOPIC" --emit=compact --no-native-web
```
- Use `--quick` or `--deep` for depth.
- Use `--store` to persist findings.
- Use `--search reddit,x,youtube,tiktok,instagram,hn,polymarket,web` for source subsets.
Watchlist mode
```shell
python3 scripts/watchlist.py add "TOPIC"
python3 scripts/watchlist.py list
python3 scripts/watchlist.py run-one "TOPIC"
python3 scripts/watchlist.py run-all
```
OpenClaw cron integration
Use this wrapper in a scheduled exec/cron job:
```shell
~/.openclaw/workspace/skills/last30days-openclaw/scripts/openclaw_watchlist_run.sh
```

This writes logs to `~/.openclaw/workspace/logs/last30days-watchlist.log`.
Briefing mode
```shell
python3 scripts/briefing.py generate
python3 scripts/briefing.py generate --weekly
python3 scripts/briefing.py show --date YYYY-MM-DD
```
History mode
```shell
python3 scripts/store.py query "TOPIC" --since 7d
python3 scripts/store.py search "QUERY"
python3 scripts/store.py trending
python3 scripts/store.py stats
```
Notes
- If native web keys are absent, run with `--no-native-web` and use OpenClaw's `web_search` tool for web supplementation.
- Preserve source weighting in synthesis: Reddit/X/YouTube/TikTok/Instagram/HN/Polymarket signals first, web second.
- Never remove attribution to @mvanhorn when republishing this adaptation.