social-favorites-to-obsidian
Review
Audited by ClawScan on May 16, 2026.
Overview
The skill’s goal is coherent, but it asks for sensitive login/session access and installs unpinned third-party scraping skills, so users should review it before installing.
Before installing:
- confirm you trust the package and the external hctec GitHub skills;
- review or pin those dependencies;
- use the least-sensitive accounts/sessions possible;
- keep cookiecloud.env and the data/vault folders private;
- review any generated cron job before enabling automatic sync.
Findings (5)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
A change or compromise in the external repository could run code in the user’s agent environment and potentially access the social-platform session data used for scraping.
The skill installs unpinned third-party scraping skills from GitHub into the OpenClaw skills workspace; those dependencies are outside the provided review set and are later used for authenticated social-platform scraping.
These skills are not on ClawHub. They come from: https://github.com/hc-tec/my-collection-skills/tree/main/skills ... The script downloads the repository from GitHub and installs the directories mapped as:
Review the downloaded hctec skills before running them, pin to known commit hashes or trusted releases, and run the setup in an isolated environment with only the needed credentials.
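One way to act on this recommendation is to pin the external repository to a commit you have already reviewed instead of cloning whatever `main` currently points at. The sketch below is illustrative: the repository URL comes from the finding above, but the commit placeholder and the helper names are hypothetical, not part of the skill.

```python
import subprocess

# URL from the finding above; the commit value is a placeholder you would
# replace with a SHA you have actually reviewed.
REPO_URL = "https://github.com/hc-tec/my-collection-skills"
PINNED_COMMIT = "<reviewed-commit-sha>"


def pinned_clone_commands(repo_url: str, commit: str, dest: str) -> list:
    """Build the git commands that clone a repo without checking out HEAD,
    then check out one specific, reviewed commit."""
    return [
        ["git", "clone", "--no-checkout", repo_url, dest],
        ["git", "-C", dest, "checkout", commit],
    ]


def install_pinned(repo_url: str, commit: str, dest: str) -> None:
    """Run the pinned-clone commands, failing loudly on any git error."""
    for cmd in pinned_clone_commands(repo_url, commit, dest):
        subprocess.run(cmd, check=True)
```

Pinning this way means a later change (or compromise) of the upstream branch cannot silently alter what runs in the agent environment; updating the pin becomes a deliberate review step.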
The skill and its dependencies can use logged-in sessions to read private favorites and can write/sync to the selected Obsidian vault.
The workflow asks for CookieCloud credentials and may ask for Obsidian account and encryption credentials. This is expected for the integration, but it is sensitive account authority.
The remote server requires an address, a UUID, and a Password. ... Official Obsidian Sync: requires the Obsidian account email/username, the login password or an interactive login, the remote vault name or ID, and, where needed, the end-to-end encryption password.
Prefer interactive login where possible, do not paste credentials into shared chats or logs, and inspect the local cookiecloud.env/config files after setup.
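When inspecting the local cookiecloud.env file after setup, one quick check is whether it is readable by other users on the machine. This is a minimal sketch, assuming a POSIX filesystem; the 0600 target mode is a common convention for credential files, not something the skill enforces, and the helper name is hypothetical.

```python
import stat
from pathlib import Path


def check_secret_file(path: str) -> list:
    """Return human-readable warnings if a credentials file (such as
    cookiecloud.env) is missing or readable by group/other users."""
    p = Path(path)
    if not p.exists():
        return [f"{path}: not found"]
    mode = stat.S_IMODE(p.stat().st_mode)
    if mode & 0o077:  # any group/other permission bits set
        return [f"{path}: mode {oct(mode)} is readable by others; "
                f"consider restricting it to owner-only (chmod 600)"]
    return []
```

Running a check like this after setup catches the common case where an installer writes secrets with the default umask and leaves them world-readable.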
Private favorites, source URLs, images, and derived notes may remain on disk and may be uploaded through Obsidian Sync if that mode is enabled.
The skill persists scraped social-favorite data, sync state, and export state on disk, and may export it into an Obsidian vault that the user syncs elsewhere.
Runtime data is stored under `data_dir`:
- `<platform>/raw/*.json`: structured raw data captured by scraping
- `<platform>/state.json`: scraping state
- `<platform>/export_state.json`: Obsidian export state
Keep the config/data/vault directories private, exclude them from public repos or shared backups, and choose local-only mode if cloud sync is not desired.
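If the config/data/vault directories live anywhere near a git repository, a simple guard is to verify they are covered by `.gitignore` before pushing. The sketch below is an assumption-laden example: the directory names mirror the finding above, but the exact paths in a real install may differ, and the helper is hypothetical.

```python
from pathlib import Path

# Illustrative paths based on the finding; adjust to the actual layout.
PRIVATE_PATHS = ["data/", "vault/", "cookiecloud.env"]


def missing_ignores(gitignore_path: str, entries: list) -> list:
    """Return the private paths not yet present in .gitignore
    (exact-line match only; glob patterns are not evaluated)."""
    path = Path(gitignore_path)
    existing = set()
    if path.exists():
        existing = {line.strip() for line in path.read_text().splitlines()}
    return [e for e in entries if e not in existing]
```

Anything this returns should be added to `.gitignore` (or the data moved outside the repo) before the directory is shared or backed up publicly.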
Running the installer can install dependencies, start Docker services, write configuration, and execute setup scripts on the local machine.
The package wrapper executes the local Python installer. This is normal for an installer skill, but it means installing/running the package grants local command execution.
const result = spawnSync(python.command, [...python.args, installScript, ...args], {
  cwd: process.cwd(),
  stdio: "inherit"
});

Run the installer only from a trusted package/source and review the prompts and commands before approving them.
Once enabled, scheduled tasks can keep accessing platform sessions and updating the Obsidian vault without a new manual command each time.
The script generates an enabled OpenClaw cron payload that will repeatedly run sync/export commands if the user adds it to the scheduler.
"payload": {
"kind": "agentTurn",
"message": f"Run this command and report only failures: {cmd}",
"timeoutSeconds": 3600,
},
"delivery": {"mode": "none"},
"enabled": TrueOnly add the generated cron jobs if ongoing automatic sync is desired, review the generated JSON command, and remove or disable the jobs when no longer needed.
