Clawfeed Digest

v1.0.3

Fetch ClawFeed AI news digests (4h/daily/weekly) and save them automatically to a specified Obsidian directory with flexible CLI options.

by Anonymous@adminlove520

Security Scan

VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description claim to fetch ClawFeed digests and save them to Obsidian; the included script fetches from https://clawfeed.kevinhe.io/api/digests and writes markdown files into an Obsidian directory. No unrelated credentials, binaries, or config paths are requested.
Instruction Scope
SKILL.md tells the agent to pip install requests and run scripts/fetch_clawfeed.py (matching the provided script). The script performs only an unauthenticated GET to the stated API and writes files to a user vault path (default: ~/OneDrive/文档/Obsidian Vault/AI新闻). Note: writing into a OneDrive-synced vault means the results will be uploaded to the user's cloud storage — this is expected behavior, but worth being aware of.
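The described behavior (unauthenticated GET, then a Markdown file write) can be sketched as follows. The endpoint URL is the one named in the listing, but the query parameters, response handling, and filename scheme here are assumptions, not the actual script:

```python
# Sketch of the fetch-and-save flow described above. API_URL comes from the
# listing; digest_filename() and the request parameters are assumed, not
# taken from the real fetch_clawfeed.py.
from datetime import date
from pathlib import Path

API_URL = "https://clawfeed.kevinhe.io/api/digests"


def digest_filename(digest_type: str, day: date) -> str:
    """Build a Markdown filename like '2025-01-31-daily.md' (assumed scheme)."""
    return f"{day.isoformat()}-{digest_type}.md"


def fetch_and_save(digest_type: str = "daily", output_dir: str = ".") -> Path:
    """Fetch one digest and write it into output_dir; returns the file path."""
    import requests  # third-party; install with `pip install requests`

    # Unauthenticated GET, matching the review's description of the script.
    resp = requests.get(API_URL, params={"type": digest_type}, timeout=30)
    resp.raise_for_status()

    out = Path(output_dir).expanduser() / digest_filename(digest_type, date.today())
    out.write_text(resp.text, encoding="utf-8")
    return out
```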
Install Mechanism
No install spec is provided; the skill is instruction-only and only recommends installing the widely used 'requests' Python package. No downloads from arbitrary URLs or archive extraction are present in the skill itself.
Credentials
The skill requests no environment variables, no credentials, and does not access other config paths. Its behavior (network GET and filesystem writes) is proportionate to the declared purpose.
Persistence & Privilege
The skill is not always-enabled and is user-invocable; it does not request elevated privileges or modify other skills' configurations. The provided cron example runs the script on a schedule — scheduling is appropriate for the stated functionality.
Assessment
This skill appears to do exactly what it says: fetch public ClawFeed digests and write them as markdown files to an Obsidian vault. Before installing or scheduling it, consider these practical points:

  • Verify the output location: by default it writes to ~/OneDrive/文档/Obsidian Vault/AI新闻. If you don't want results synced to OneDrive or other cloud storage, supply --output pointing to a local-only folder.
  • Run it manually first: execute python scripts/fetch_clawfeed.py locally to confirm the content and filenames meet your expectations before adding a cron job.
  • Review and trust the source: the package metadata has no homepage; the docs point to a GitHub repo (adminlove520/clawfeed-digest). If provenance matters, inspect that upstream repo for updates or tampering.
  • Data handling: the script will overwrite files with the same generated filename; back up any important notes you might overwrite.
  • Environment hygiene: install 'requests' in a virtualenv rather than system-wide if you prefer isolation.

Overall there are no code-level signs of credential harvesting, unexpected network endpoints, or privileged operations, so the skill is internally consistent with its stated purpose.
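The overwrite concern noted above can be mitigated with a small wrapper that never clobbers an existing note. This is a suggested pattern for anyone modifying the script, not something the skill itself provides:

```python
# Suggested overwrite-safe write helper (not part of the skill): if the
# target file already exists, append -1, -2, ... to the stem instead of
# silently replacing the existing note.
from pathlib import Path


def write_without_overwrite(path: Path, text: str) -> Path:
    """Write text to path, choosing a numbered variant if path exists."""
    candidate = path
    counter = 1
    while candidate.exists():
        candidate = path.with_name(f"{path.stem}-{counter}{path.suffix}")
        counter += 1
    candidate.write_text(text, encoding="utf-8")
    return candidate
```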

Like a lobster shell, security has layers — review code before you run it.

latest: vk977s0j61gtmsghx1a9hb96tsx828az9
455 downloads · 1 star · 4 versions
Updated 1mo ago
v1.0.3 · MIT-0

ClawFeed Digest Fetcher

Fetch ClawFeed AI news digests and write them into an Obsidian knowledge base.

Features

  • Fetch ClawFeed digests (4h / daily / weekly)
  • Automatically write them into a specified Obsidian directory

Usage

# Install dependencies
pip install requests

# Fetch today's daily digest
python scripts/fetch_clawfeed.py

# Fetch the 4h digest
python scripts/fetch_clawfeed.py -t 4h

Parameters

  • --type, -t: digest type (4h, daily, weekly)
  • --limit, -l: number of digests to fetch
  • --output, -out: output directory
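The flags above could be wired together with argparse roughly as follows; the flag names come from the list, but the defaults and help strings here are assumptions about the script, not its actual contents:

```python
# Sketch of the CLI described above. Flag names (--type/-t, --limit/-l,
# --output/-out) are from the listing; defaults are assumed.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Fetch ClawFeed digests into an Obsidian vault."
    )
    parser.add_argument("--type", "-t", choices=["4h", "daily", "weekly"],
                        default="daily", help="digest type")
    parser.add_argument("--limit", "-l", type=int, default=1,
                        help="number of digests to fetch (assumed default)")
    parser.add_argument("--output", "-out",
                        default="~/OneDrive/文档/Obsidian Vault/AI新闻",
                        help="output directory (default from the listing)")
    return parser
```

Note that argparse accepts multi-character single-dash options, so the unusual short flag -out shown in the parameter list is valid as written.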

Data Source

OpenClaw Scheduled Task

{
  "name": "Daily AI news digest",
  "schedule": "0 17 * * *",
  "payload": {
    "message": "Run python ~/.openclaw/skills/clawfeed-digest/scripts/fetch_clawfeed.py"
  }
}
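The schedule field "0 17 * * *" is standard cron syntax meaning 17:00 every day. Outside OpenClaw, an equivalent plain crontab entry (assuming the same install path as above) would be:

# m h dom mon dow  command
0 17 * * * python ~/.openclaw/skills/clawfeed-digest/scripts/fetch_clawfeed.py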
