Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

xinwencaiji

v1.0.0

Run a self-contained Chinese and international AI news workflow inside the current workspace. Use when the user wants either high-frequency RSS capture only...


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for nighmat1220/ai-news-collection.

Prompt Preview: Install & Setup
Install the skill "xinwencaiji" (nighmat1220/ai-news-collection) from ClawHub.
Skill page: https://clawhub.ai/nighmat1220/ai-news-collection
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install ai-news-collection

ClawHub CLI


npx clawhub@latest install ai-news-collection
Security Scan
VirusTotal
Benign
OpenClaw
Suspicious
medium confidence
Purpose & Capability
The skill's code and SKILL.md align with the stated purpose: collecting RSS/Atom feeds, deduplicating, producing cumulative Excel files and a Word brief, and optionally calling an AI model for titles/summaries. However, the registry metadata did not declare the external model credential (ARK_API_KEY) or model base (ARK_API_BASE) even though SKILL.md and scripts require them; this mismatch is unexpected and should be corrected.
Instruction Scope
Runtime instructions are narrowly scoped to the workspace (config, data, reports, state) and to running the bundled Python scripts. The scripts fetch arbitrary RSS/Atom URLs from user-provided config files and will POST content to an external model API for AI summarization. That behavior is coherent with the skill's purpose but means collected article text (and any credentials present in feed configs) will be transmitted externally.
Install Mechanism
There is no install spec in the registry (instruction-only). The bundled scripts list Python dependencies (openpyxl, python-docx) to be installed via pip — a low-risk, typical approach. No arbitrary binary downloads or obscure installers are present.
Credentials
The SKILL.md and scripts require ARK_API_KEY (and allow ARK_MODEL / ARK_API_BASE overrides) to call an external model service. The registry's required-env list is empty, which is inconsistent. Requesting an API key for the external model is proportionate to the AI-summary functionality, but the missing declaration and the default ARK_API_BASE (https://ark.cn-beijing.volces.com/api/v3) require you to verify the endpoint and trust the operator before supplying credentials.
Persistence & Privilege
The skill does not request always:true and does not modify other skills or system-wide settings. It writes state, logs, caches, reports, and snapshots into the workspace (data/, reports/, state/, logs/), which is normal for this type of tool.
What to consider before installing
This skill appears to do what it says (fetch RSS feeds, build Excel/Word reports, and optionally call an external model for summaries). Before installing or running it, consider the following:

  • The scripts will send collected article content to an external model endpoint using ARK_API_KEY / ARK_API_BASE. The registry metadata did not list these required env vars; verify the skill author and the endpoint before providing keys.
  • The default ARK_API_BASE points to an external service (https://ark.cn-beijing.volces.com). Confirm this is a trusted API and that your ARK_API_KEY is scoped appropriately. If you do not want outbound data sent, run with --disable-ai or omit ARK_API_KEY.
  • Feed configs can include authentication (username/password or custom headers). Those credentials may be used by the feed fetcher and stored under workspace state/logs; keep sensitive feeds out of the same workspace, or review how credentials are provided.
  • The skill will create and write files and directories (data/, reports/, state/, logs/, snapshots/) in the chosen workspace. Back up or isolate any existing data you care about.
  • The package source is unknown and has no homepage. If you intend to run it in a production or sensitive environment, review the full script contents (they are bundled) and verify the model endpoint and data-handling behavior.

If you trust the code and endpoint: set ARK_API_KEY (and ARK_API_BASE/ARK_MODEL if needed), run dependency installation in an isolated environment, and consider running with --disable-ai first to validate data ingestion without external network calls.

Like a lobster shell, security has layers — review code before you run it.

latest: vk973dk0a4109q02pep3fjvpked833vqn
193 downloads
0 stars
1 version
Updated 13h ago
v1.0.0
MIT-0

AI News Pipeline

Overview

This skill is executable on its own. The actual workflow scripts are bundled in scripts/. Run them against the current workspace, or pass --workspace /path/to/workspace explicitly.

Workspace Requirements

The target workspace should contain or accept these files and folders:

  • config/sources.json
  • config/international_sources.json
  • companies.txt
  • data/
  • reports/
  • state/

If the folders do not exist, the scripts create them.
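
Although the scripts create missing folders themselves, the expected layout can be bootstrapped up front. The sketch below is a minimal, hypothetical helper: the folder and file names come from the list above, but the placeholder config contents (`[]`) are an assumption, since the real sources.json schema is not documented here.

```python
from pathlib import Path

def bootstrap_workspace(root: str) -> None:
    """Create the folders and placeholder files the skill expects."""
    ws = Path(root)
    for folder in ("config", "data", "reports", "state"):
        (ws / folder).mkdir(parents=True, exist_ok=True)
    for cfg in ("config/sources.json", "config/international_sources.json"):
        path = ws / cfg
        if not path.exists():
            # Placeholder only; the real feed-config schema is skill-specific.
            path.write_text("[]\n", encoding="utf-8")
    (ws / "companies.txt").touch()

bootstrap_workspace("./workspace")
```

Running this once gives you a workspace you can inspect and fill in before the first capture run.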

Install Dependencies

Install Python dependencies before first use:

python -m pip install -r /path/to/skill/scripts/requirements.txt

Available Entrypoints

Use the bundled Python entrypoints depending on the job type.

Capture Only

Use this for high-frequency collection jobs. It only captures feeds, updates deduplication state, and writes raw and incremental data.

python /path/to/skill/scripts/run_capture_only.py --workspace /path/to/workspace

Report Only

Use this for scheduled delivery jobs. It reads already-collected data, calls the model for summaries and titles, updates the cumulative Excel files, and rebuilds the Word brief.

By default it uses the reporting window from yesterday 00:00 to today 08:00.

python /path/to/skill/scripts/run_report_only.py --workspace /path/to/workspace

Optional time window:

python /path/to/skill/scripts/run_report_only.py --workspace /path/to/workspace --time-window "2026-03-15 00:00 to 2026-03-16 08:00"
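The --time-window string follows the pattern "YYYY-MM-DD HH:MM to YYYY-MM-DD HH:MM". As an illustration of that format only (the bundled scripts' actual parser may differ), it can be split and validated like this:

```python
from datetime import datetime

def parse_time_window(window: str) -> tuple[datetime, datetime]:
    """Parse 'YYYY-MM-DD HH:MM to YYYY-MM-DD HH:MM' into two datetimes."""
    start_s, end_s = window.split(" to ")
    fmt = "%Y-%m-%d %H:%M"
    start = datetime.strptime(start_s, fmt)
    end = datetime.strptime(end_s, fmt)
    if end <= start:
        raise ValueError("time window end must be after start")
    return start, end

start, end = parse_time_window("2026-03-15 00:00 to 2026-03-16 08:00")
```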

Optional skip-AI mode:

python /path/to/skill/scripts/run_report_only.py --workspace /path/to/workspace --disable-ai

Full Workflow

python /path/to/skill/scripts/run_full_workflow.py --workspace /path/to/workspace

Optional time window:

python /path/to/skill/scripts/run_full_workflow.py --workspace /path/to/workspace --time-window "2026-03-15 00:00 to 2026-03-15 18:00"

Optional skip-AI mode:

python /path/to/skill/scripts/run_full_workflow.py --workspace /path/to/workspace --disable-ai

What Each Entrypoint Does

run_capture_only.py

  1. Collect domestic RSS items into data/YYYY-MM-DD.jsonl.
  2. Collect domestic raw items into data/domestic_raw_YYYY-MM-DD.jsonl.
  3. Collect international raw items into data/international_raw_YYYY-MM-DD.jsonl.
  4. Filter international items into data/international_YYYY-MM-DD.jsonl.
  5. Save per-source snapshots in snapshots/.
  6. Update RSS deduplication and source metrics in state/feed_state.json.
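
Step 6 above implies GUID-based deduplication against state/feed_state.json. The sketch below shows one way that could work; the real state schema is not documented here, so the "seen_ids" key and the per-item "guid" field are assumptions.

```python
import json
from pathlib import Path

def filter_new_items(items, state_path="state/feed_state.json"):
    """Return only items not seen before, and record them in the state file."""
    path = Path(state_path)
    state = json.loads(path.read_text()) if path.exists() else {"seen_ids": []}
    seen = set(state["seen_ids"])
    fresh = [it for it in items if it["guid"] not in seen]
    seen.update(it["guid"] for it in fresh)
    state["seen_ids"] = sorted(seen)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(state, ensure_ascii=False, indent=2))
    return fresh
```

A second run over the same items returns nothing new, which is why deleting state/feed_state.json (see Troubleshooting) forces a full re-capture.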

run_report_only.py

  1. Read the selected time window from collected data.
  2. Build the cumulative domestic Excel output in reports/company_mentions.xlsx.
  3. Build the cumulative international Excel output in reports/international_company_mentions.xlsx.
  4. Call the model to generate domestic AI titles and AI summaries.
  5. Call the model to generate international AI titles, AI summaries, and impact scores.
  6. Build a merged daily Word brief in reports/.

run_full_workflow.py

  1. Run capture.
  2. Run domestic reporting.
  3. Run international reporting.

Inputs

  • Domestic RSS config: config/sources.json
  • International RSS config: config/international_sources.json
  • Company list: companies.txt
  • Volcengine key: ARK_API_KEY
  • Optional model override: ARK_MODEL

Important Behavior

  • state/feed_state.json controls RSS deduplication.
  • Excel files are cumulative.
  • The Word brief is rebuilt per run.
  • The international section of the Word brief includes only the top 5 items by impact score within the selected time window.
  • International items without a successful AI summary are excluded from the Word brief.
  • AI cache files are deleted automatically after each run.
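
The two selection rules above (top 5 by impact score, summary-less items excluded) can be sketched together. The field names "ai_summary" and "impact_score" are assumptions for illustration; the bundled scripts may name them differently.

```python
def select_international_items(items):
    """Keep items with a successful AI summary, then take the top 5 by impact score."""
    summarized = [it for it in items if it.get("ai_summary")]
    return sorted(summarized, key=lambda it: it["impact_score"], reverse=True)[:5]
```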

Troubleshooting

  1. If the workflow does not rerun old RSS items, check state/feed_state.json.
  2. If AI columns are empty, check whether ARK_API_KEY is set in the execution environment.
  3. If the user wants a full rebuild, delete the relevant daily data files and state/feed_state.json, then rerun.
  4. If the user needs exact commands or cloud prompts, read references/commands.md.
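
Troubleshooting step 3 (full rebuild) can be sketched as a small helper; the glob pattern for daily data files is an assumption based on the file names listed earlier (data/YYYY-MM-DD.jsonl and related raw files).

```python
from pathlib import Path

def reset_workspace(root="."):
    """Delete daily data files and the dedup state so the next run rebuilds from scratch."""
    ws = Path(root)
    for f in ws.glob("data/*.jsonl"):
        f.unlink()
    state = ws / "state" / "feed_state.json"
    if state.exists():
        state.unlink()
```

Run it only on a workspace you are willing to re-capture; the cumulative Excel files in reports/ are left untouched.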

References

  • references/commands.md
