Security News Feed
Audited by ClawScan on May 10, 2026.
Overview
The skill mostly matches a security-news automation tool, but it needs review because it under-declares credentials and account authority, uses GLM/ZAI despite being described as Gemini-based, and includes an unscoped Notion test write.
Review the code and configuration before installing. Use a dedicated Notion integration with access only to the intended database, remove or fix the hard-coded Notion check script, confirm whether you want Gemini or GLM/ZAI, and do not enable hourly scheduling or Tistory publishing until the target accounts and caches are clearly scoped.
Findings (7)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
The skill may use account-level credentials and local profile settings that are not visible from the registry capability contract.
The code relies on Notion account credentials, LLM provider keys, and optional browser profile configuration even though the registry metadata declares no env vars or primary credential.
NOTION_API_TOKEN = os.getenv("NOTION_API_KEY") ... ZAI_API_KEY = os.getenv("SECURITY_NEWS_GLM_API_KEY") or os.getenv("GLM_API_KEY") ... CHROME_USER_DATA_DIR = os.getenv("CHROME_USER_DATA_DIR")
Declare all required and optional credentials in metadata, use skill-scoped configuration, and require explicit user setup for any browser-profile or account-writing behavior.
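As a hedged sketch of the recommended fix (variable names taken from the snippet above; the `REQUIRED`/`OPTIONAL` split is an assumption to adjust to the skill's actual contract), the skill could validate its declared credentials at startup instead of silently reading whatever is in the environment:

```python
import os

# Hypothetical declaration mirroring the registry metadata; adjust names
# to whatever the skill actually requires.
REQUIRED = ["NOTION_API_KEY"]
OPTIONAL = ["SECURITY_NEWS_GLM_API_KEY", "GLM_API_KEY", "CHROME_USER_DATA_DIR"]

def load_credentials() -> dict:
    """Fail fast when a declared-required credential is missing."""
    missing = [name for name in REQUIRED if not os.getenv(name)]
    if missing:
        raise RuntimeError(f"Missing required credentials: {', '.join(missing)}")
    # Only surface variables that are actually set, so optional browser-profile
    # or GLM settings stay opt-in.
    return {name: os.getenv(name) for name in REQUIRED + OPTIONAL if os.getenv(name)}
```

Failing fast at startup also makes the credential surface auditable: the registry metadata and the `REQUIRED`/`OPTIONAL` lists can be diffed directly.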
If run with a token that has access, the script can write to a specific Notion database that may not be the user's intended target.
The connection-check script reads the user's global OpenClaw .env Notion token and attempts a Notion page creation against a hard-coded database ID rather than a user-selected database.
database_id = "fe8277a4484243db8b3b2f1a15399d40" ... env_file = Path.home() / '.openclaw' / 'workspace' / '.env' ... 'https://api.notion.com/v1/pages'
Remove hard-coded database IDs, require an explicit user-provided database ID, and show the target before any test page is created.
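A minimal sketch of that recommendation (the `NOTION_DATABASE_ID` variable name is an assumption, not part of the reviewed skill): require the user to supply the target database and confirm it before any test page is created.

```python
import os
import re

def resolve_database_id() -> str:
    """Refuse to fall back to any hard-coded database ID."""
    db_id = os.getenv("NOTION_DATABASE_ID", "").replace("-", "")
    # Notion database IDs are 32 hex characters once dashes are removed.
    if not re.fullmatch(r"[0-9a-f]{32}", db_id):
        raise ValueError("Set NOTION_DATABASE_ID to the intended target database.")
    return db_id

def confirm_target(db_id: str) -> bool:
    """Show the target and require explicit confirmation before a test write."""
    answer = input(f"Create a test page in Notion database {db_id}? [y/N] ")
    return answer.strip().lower() == "y"
```

This keeps the connection check useful while making the write target visible and user-chosen rather than baked into the script.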
Users may assume that article content and API credentials go to Gemini, when the code may instead send them to a different LLM provider.
The implementation defaults to a GLM/ZAI endpoint, while the skill description repeatedly says it summarizes with the Gemini API.
# LLM API (ZAI) ... SECURITY_LLM_BASE_URL = os.getenv("GLM_BASE_URL", "https://api.z.ai/api/coding/paas/v4") ... SECURITY_LLM_MODEL = os.getenv("SECURITY_LLM_MODEL", "glm-4.7")
Update the description and setup instructions to name the actual provider, or change the code to use Gemini as documented.
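One way to resolve the mismatch is to make the provider an explicit choice rather than a silent default. This is a hedged sketch: the `SECURITY_LLM_PROVIDER` variable and the Gemini base URL shown here are assumptions, not part of the reviewed skill.

```python
import os

# Hypothetical provider table; the GLM URL matches the skill's current default,
# the Gemini URL is the public Generative Language API endpoint.
PROVIDERS = {
    "gemini": "https://generativelanguage.googleapis.com/v1beta",
    "glm": "https://api.z.ai/api/coding/paas/v4",
}

def llm_base_url() -> str:
    """Require an explicit provider choice instead of defaulting to ZAI/GLM."""
    provider = os.getenv("SECURITY_LLM_PROVIDER")
    if provider not in PROVIDERS:
        raise ValueError(f"Set SECURITY_LLM_PROVIDER to one of {sorted(PROVIDERS)}")
    # An explicit base-URL override still wins, but only once a provider is named.
    return os.getenv("SECURITY_LLM_BASE_URL", PROVIDERS[provider])
```

With this shape, documentation and code cannot silently diverge: whichever provider the setup names is the one the credentials reach.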
A malicious or malformed article could influence generated summaries or analysis that are saved automatically.
Untrusted crawled article content is passed into an LLM and the generated output is then published to Notion.
summary = summarize_text(article['content'], article['title']) ... details = details_text(article['content'], article['title']) ... pub_result = publisher.publish_article(
Treat crawled content as untrusted, add prompt-injection-resistant summarization prompts, and review outputs before enabling public blog publishing.
Once scheduled, it can continue crawling and publishing without a fresh prompt each time.
The skill documents hourly automatic execution through a scheduler.
**Interval**: runs automatically every hour ... trigger: type: "interval" hours: 1
Enable the scheduler only after confirming target databases, tokens, and posting settings; keep Tistory/public posting disabled unless intended.
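A small sketch of that gating (the opt-in variable names are hypothetical, not part of the skill): keep scheduled runs and public posting behind explicit flags so a default install cannot publish unattended.

```python
import os

def scheduler_allowed() -> bool:
    """Hourly runs only when the user has explicitly opted in."""
    return os.getenv("SECURITY_NEWS_ENABLE_SCHEDULER") == "1"

def tistory_allowed() -> bool:
    """Public Tistory posting stays off unless separately enabled."""
    return os.getenv("SECURITY_NEWS_ENABLE_TISTORY") == "1"
```

Separating the two flags means a user can enable hourly Notion archiving while leaving public blog publishing disabled.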
Article titles, URLs, and related metadata may remain on disk after runs.
The skill persists Notion-derived URL/title data in a local SQLite cache for deduplication and keyword analysis.
SQLite-based caching layer for keyword statistics optimization. Caches Notion API responses ... def __init__(self, db_path: str = "data/url_cache.db")
Document cache retention, provide a cleanup command, and avoid storing more Notion content than needed.
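A cleanup command could look like the following hedged sketch. The `cache(url, title, created_at)` schema is an assumption for illustration; match it to the skill's actual `url_cache.db` table before use.

```python
import sqlite3
import time

def prune_cache(db_path: str = "data/url_cache.db", max_age_days: int = 30) -> int:
    """Delete cached rows older than the retention window; return rows removed.

    Assumes a hypothetical `cache(url TEXT, title TEXT, created_at REAL)`
    table storing Unix timestamps.
    """
    cutoff = time.time() - max_age_days * 86400
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute("DELETE FROM cache WHERE created_at < ?", (cutoff,))
        conn.commit()
        return cur.rowcount
```

Documenting the retention window alongside this command tells users exactly how long Notion-derived titles and URLs persist on disk.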
Users may install code or dependencies from outside the reviewed registry artifact path.
The setup instructions rely on an external recursive Git clone and Python dependency installation despite the registry having no install spec.
git clone --recursive https://github.com/rebugui/OpenClaw.git ... pip install -r requirements.txt
Prefer a pinned install spec, publish dependency versions, and avoid recursive external setup unless the submodules are clearly documented.
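A hedged sketch of a pinned setup (the commit placeholder and example package pins are illustrative, not taken from the repository):

```shell
# Clone at the reviewed revision instead of tracking the moving default branch.
# <reviewed-commit-sha> is a placeholder for the audited commit.
git clone --recursive https://github.com/rebugui/OpenClaw.git
git -C OpenClaw checkout <reviewed-commit-sha>

# requirements.txt should pin exact versions, for example:
#   requests==2.32.3
#   notion-client==2.2.1
pip install -r OpenClaw/requirements.txt
```

Pinning both the clone and the dependency versions keeps the installed code within the reviewed artifact path and makes the install reproducible.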
