Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

qqbot-daily-news-briefing

v1.0.0

Generates and delivers automated daily tech and finance news briefings with AI commentary via QQ, Telegram, or Discord using Baidu API or DuckDuckGo search.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for propn/qqbot-daily-news-briefing.

Prompt preview (Install & Setup):
Install the skill "qqbot-daily-news-briefing" (propn/qqbot-daily-news-briefing) from ClawHub.
Skill page: https://clawhub.ai/propn/qqbot-daily-news-briefing
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install qqbot-daily-news-briefing

ClawHub CLI


npx clawhub@latest install qqbot-daily-news-briefing
Security Scan
  • VirusTotal: Suspicious
  • OpenClaw: Suspicious (high confidence)
Purpose & Capability
The skill's stated purpose (news aggregation + delivery) matches the scripts' behavior, but required capabilities are not declared. The code expects a Baidu search helper at {WORKSPACE}/skills/baidu-search/scripts/search.py (invoked via subprocess) even though the package/README/SKILL metadata do not declare this dependency. Scripts also assume OpenClaw CLI availability for delivery. The registry metadata lists no required env vars or credentials, yet the runtime expects BAIDU_API_KEY and target-user/channel settings. These mismatches mean the skill will fail or behave unexpectedly unless external dependencies and credentials are provided.
Instruction Scope
SKILL.md instructs adding environment variables (BAIDU_API_KEY, NEWS_TARGET_USER / QQ_TARGET_USER) and setting cron jobs and editing scripts — which is reasonable — but it does not document the required external baidu-search script path referenced at runtime. The instructions also direct writing/reading from system-wide locations (/etc/profile.d, /var/log, /root/.openclaw/workspace) and creating cron jobs; these have system-wide effects and require appropriate privileges. There is no instruction to install or verify the external 'baidu-search' skill or the OpenClaw CLI; the code will invoke those without checking for presence.
Install Mechanism
There is no install spec (instruction-only install) — the lowest install risk — and all code is included in the bundle. That reduces risk from arbitrary downloads. However, the scripts expect external artifacts (OpenClaw CLI and a separate baidu-search script under WORKSPACE/skills) which are not provided; this is an operational/consistency gap rather than a network-install risk.
Credentials
Registry metadata claims no required env vars or primary credential, but SKILL.md and code clearly use BAIDU_API_KEY and target-user variables (NEWS_TARGET_USER / QQ_TARGET_USER). The skill also writes logs to /var/log and stores files under /root/.openclaw/workspace, implying elevated privileges. Asking users to place API keys in system-wide /etc/profile.d or /etc/environment is more privileged than necessary and should be optional/explicit. The skill exposes hardcoded sample target IDs in scripts which should be removed or clearly documented.
Persistence & Privilege
always:false (normal). The skill's instructions encourage persistent scheduling via system cron and the delivery script (and news-deliver-direct.py) may schedule OpenClaw cron sessions automatically. That creates persistent scheduled behavior, which is consistent with its purpose. Still, because the scripts write to system paths (/var/log, /root workspace) and may schedule tasks, you should review and run them in a controlled environment (non-root or container) before deployment.
What to consider before installing
This skill is plausible for generating/delivering news, but several red flags need attention before installing:

  • Missing declared dependencies: The generator calls a separate Baidu helper at {WORKSPACE}/skills/baidu-search/scripts/search.py which is not included or mentioned as a required package; ask the author where it comes from or provide it yourself.
  • Undeclared environment variables: The registry lists none, but the code and SKILL.md require BAIDU_API_KEY and target-user envs. Treat any API key you set as sensitive.
  • Privileged file locations: Scripts default to /root/.openclaw/workspace and /var/log; they will likely fail or require root. Prefer running in a dedicated non-root user or container and update WORKSPACE and log paths accordingly.
  • Hardcoded sample target IDs are present in scripts; replace them with your own values before use and verify the target format.
  • Persistent scheduling: The skill instructs cron setup and may add OpenClaw cron sessions; verify scheduled jobs after installation and make sure you want the automatic daily deliveries.
  • OpenClaw CLI dependency: Delivery methods rely on the openclaw command. Confirm it is installed and configured and that the channels (qqbot/telegram/discord) are authorized.

Recommended next steps before using:

  1. Request or locate the missing 'baidu-search' helper and update the README/SKILL metadata to declare it as a dependency.
  2. Run the scripts in a sandboxed, non-root environment, updating WORKSPACE and LOG paths to user-owned directories.
  3. Remove or replace hardcoded target IDs.
  4. Only export BAIDU_API_KEY if you trust the code, and prefer per-user (not system-wide) env config.
  5. If you cannot confirm the origin of the baidu-search helper or the author, treat the skill as untrusted and avoid giving it production credentials.

Like a lobster shell, security has layers — review code before you run it.

latest: vk97fr1kdbrzq680pjp3y08wmks83rygh
75 downloads · 0 stars · 1 version · Updated 1mo ago
v1.0.0
MIT-0

Daily News Briefing Skill

Automated daily news briefing system that generates comprehensive tech and financial reports with AI-powered commentary and delivers them via configured messaging channels. Supports dual-search architecture with Baidu API (preferred when configured) and DuckDuckGo fallback (always available, no API key required).
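The dual-search selection can be sketched as a small helper. `pick_search_method` is a hypothetical name (not the shipped code), and the "longer than 20 characters" validity check mirrors the troubleshooting advice in this README:

```python
import os

def pick_search_method(env=None):
    """Return which search backend the briefing generator would use.

    Sketch of the dual-search architecture: Baidu is preferred when a
    plausible BAIDU_API_KEY is set; otherwise the skill falls back to
    DuckDuckGo, which needs no key.
    """
    if env is None:
        env = os.environ
    key = env.get("BAIDU_API_KEY", "")
    # The troubleshooting section treats very short keys as invalid.
    if len(key) > 20:
        return "baidu"
    return "duckduckgo"
```

Usage: with no environment configured at all, the helper returns `duckduckgo`, matching the "works out of the box" behavior described above.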

Installation & Configuration

Step 1: Install via ClawHub

Option A: From ClawHub (Published)

# Search and install
clawhub search daily-news-briefing
clawhub install daily-news-briefing

# Verify installation
ls ~/.openclaw/skills/daily-news-briefing/

Option B: From Local Directory

# Copy skill to OpenClaw skills directory
cp -r /path/to/daily-news-briefing ~/.openclaw/skills/

# Verify structure
ls ~/.openclaw/skills/daily-news-briefing/scripts/

Step 2: Configure Environment Variables (Optional)

Note: The skill has a built-in fallback to DuckDuckGo if no Baidu API key is configured. Baidu API provides better structured data, but DuckDuckGo works perfectly fine without any API keys.

With Baidu API (Recommended for Better Results):

Create a configuration file with your Baidu Search API key:

Option A: System-wide (Recommended for servers)

# Create config file
sudo nano /etc/profile.d/daily-news-briefing.sh

# Add your configuration
export BAIDU_API_KEY="bce-v3/ALTAK-your-api-key-here"
export NEWS_TARGET_USER="your-qq-user-id"  # QQ user ID (replace with your own)
export NEWS_CHANNEL="qqbot"  # qqbot, telegram, discord

# Reload configuration
source /etc/profile.d/daily-news-briefing.sh

Option B: User-specific

# Add to ~/.bashrc or ~/.zshrc
echo 'export BAIDU_API_KEY="your-api-key"' >> ~/.bashrc
echo 'export NEWS_TARGET_USER="target-user-id"' >> ~/.bashrc
source ~/.bashrc

Without API Key (Uses DuckDuckGo):

The skill will automatically use DuckDuckGo web search if no Baidu API key is configured:

# Just set target user and channel - that's it!
export NEWS_TARGET_USER="your-qq-user-id"
export NEWS_CHANNEL="qqbot"
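For illustration, the keyless DuckDuckGo path boils down to fetching html.duckduckgo.com and scraping titles and URLs. A minimal parser sketch, assuming the results page marks result links with a `result__a` class (an assumption about that page's markup, not the skill's actual code):

```python
import re

# Hypothetical pattern for html.duckduckgo.com result links; the class
# name is an assumption and may change with the page's markup.
RESULT_RE = re.compile(
    r'<a[^>]*class="result__a"[^>]*href="([^"]+)"[^>]*>(.*?)</a>', re.S
)

def parse_duckduckgo_results(html, count=3):
    """Return up to `count` (title, url) pairs from a results page."""
    hits = []
    for url, title in RESULT_RE.findall(html):
        # Strip any markup nested inside the link text.
        clean_title = re.sub(r"<[^>]+>", "", title).strip()
        hits.append((clean_title, url))
        if len(hits) >= count:
            break
    return hits
```

This matches the "Title + URL only" content preview noted in the comparison below: no article snippets are available without the Baidu API.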

Compare Search Methods:

| Feature | Baidu API | DuckDuckGo |
| --- | --- | --- |
| API Key Required | ✅ Yes (75 chars) | ❌ No |
| Result Quality | 🏆 Better structured data | 👍 Good for most use cases |
| Rate Limits | ⚠️ API quota applies | ✅ None |
| Setup Time | ~2 mins (get API key) | 0 mins (works out of box) |
| Content Preview | Full article snippets | Title + URL only |

Recommendation: Start with DuckDuckGo to test quickly, add Baidu API later if you need better content previews.

Step 3: Customize Delivery Settings (Optional)

Edit the delivery script to match your preferences:

nano ~/.openclaw/skills/daily-news-briefing/scripts/deliver-briefing.sh

Key settings to modify:

# Change target user ID
TARGET_USER="your-qq-user-id"  # Line ~10

# Change delivery channel (--channel parameter)
--channel qqbot      # QQ Bot (default)
--channel telegram   # Telegram
--channel discord    # Discord

Step 4: Set Up Cron Job for Automated Delivery

For daily delivery at 9:00 AM:

# Edit crontab
crontab -e

Add these lines (adjust paths if needed):

# Generate news at 9:00 AM
0 9 * * * source /etc/profile && cd ~/.openclaw/skills/daily-news-briefing/scripts && python3 generate-briefing.py >> /var/log/daily-news.log 2>&1

# Deliver at 9:01 AM  
1 9 * * * source /etc/profile && bash ~/.openclaw/skills/daily-news-briefing/scripts/deliver-briefing.sh >> /var/log/news-delivery.log 2>&1

Custom delivery times:

| Time | Cron Entry (Generate + Deliver) | Use Case |
| --- | --- | --- |
| 7:00 AM | `0 7 * * * ...`<br>`1 7 * * * ...` | Early morning briefing |
| 7:30 AM | `30 7 * * * ...`<br>`31 7 * * * ...` | After the system is fully up (recommended) |
| 6:00 PM | `0 18 * * * ...`<br>`1 18 * * * ...` | Evening summary |
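The generate-at-H:M, deliver-one-minute-later pattern above can be produced programmatically. `cron_entries` is a hypothetical helper using this README's default paths:

```python
def cron_entries(hour, minute=0,
                 skill_dir="~/.openclaw/skills/daily-news-briefing/scripts",
                 log_dir="/var/log"):
    """Build the generate/deliver crontab pair described in this README.

    Sketch only: delivery runs one minute after generation, matching the
    9:00/9:01 example; paths default to the README's locations.
    """
    gen = (f"{minute} {hour} * * * source /etc/profile && "
           f"cd {skill_dir} && python3 generate-briefing.py "
           f">> {log_dir}/daily-news.log 2>&1")
    # Roll over the minute (and hour) so :59 schedules still work.
    deliver_minute = (minute + 1) % 60
    deliver_hour = hour if minute + 1 < 60 else (hour + 1) % 24
    dlv = (f"{deliver_minute} {deliver_hour} * * * source /etc/profile && "
           f"bash {skill_dir}/deliver-briefing.sh "
           f">> {log_dir}/news-delivery.log 2>&1")
    return gen, dlv
```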

Step 5: Test the Setup

Test API Key:

# The script filename contains a hyphen, so `from generate_briefing import ...` fails; load it by path instead
python3 -c "import importlib.util as u; s=u.spec_from_file_location('gb','generate-briefing.py'); m=u.module_from_spec(s); s.loader.exec_module(m); print(m.search_baidu('test', count=1))"

Generate News Manually:

cd ~/.openclaw/skills/daily-news-briefing/scripts
python3 generate-briefing.py

Check log: tail -20 /var/log/daily-news.log

Test Delivery:

bash deliver-briefing.sh

Check log: tail -30 /var/log/news-delivery.log

Verify Cron is Running:

# Check cron service
systemctl status cron

# View scheduled jobs
crontab -l

# Check last execution
grep "Starting news generation" /var/log/daily-news.log | tail -1

Step 6: Customize News Content (Optional)

Modify search queries in generate-briefing.py:

# Line ~85-90, customize keywords
china_tech = search_baidu('科技新闻 人工智能 芯片 AI 华为', count=3)
intl_tech = search_baidu('NVIDIA Broadcom Apple Microsoft AI', count=3)
china_finance = search_baidu('A 股 上证指数 港股 财经', count=3)
intl_finance = search_baidu('美股 纳斯达克 道琼斯 比特币', count=3)

Adjust article count: Change count=3 to 1-5 articles per category.
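One way to keep these knobs in one place is a small config mapping. `CATEGORY_QUERIES` and `clamp_count` are illustrative names, not part of the shipped script:

```python
# Hypothetical config mirroring the four categories above; edit queries
# or counts here instead of scattering literals through the script.
CATEGORY_QUERIES = {
    "china_tech": ("科技新闻 人工智能 芯片 AI 华为", 3),
    "intl_tech": ("NVIDIA Broadcom Apple Microsoft AI", 3),
    "china_finance": ("A 股 上证指数 港股 财经", 3),
    "intl_finance": ("美股 纳斯达克 道琼斯 比特币", 3),
}

def clamp_count(count, lo=1, hi=5):
    """Keep per-category article counts in the 1-5 range suggested above."""
    return max(lo, min(hi, count))
```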

Customize AI commentary rules: See references/CONFIGURATION.md for pattern examples.

Components

Scripts

  • generate-briefing.py: Main news generation script. Fetches from 4 categories:

    • China Tech News (AI, chips, Huawei, etc.)
    • International Tech News (NVIDIA, Apple, Microsoft, etc.)
    • China Financial Markets (A-shares, HK stocks)
    • International Finance (US stocks, Fed, crypto)
  • deliver-briefing.sh: Delivery wrapper with multiple fallback strategies:

    • Method A: Direct OpenClaw CLI message send
    • Method B: Python delivery script
    • Method C: Create notification file
  • news-deliver-direct.py: Alternative Python-based delivery with headline extraction
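The Method A/B/C chain amounts to trying each delivery strategy until one succeeds. A minimal sketch, assuming each method is wrapped as a callable returning True on success (the real script shells out to the OpenClaw CLI, a Python helper, and finally a notification file):

```python
def deliver_with_fallback(message, methods):
    """Try (name, send) pairs in order; return the name that succeeded.

    Hypothetical helper illustrating the fallback order described above;
    a raising or False-returning method falls through to the next one.
    """
    for name, send in methods:
        try:
            if send(message):
                return name  # record which delivery method worked
        except Exception:
            pass  # treat errors the same as a failed attempt
    return None  # every method failed
```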

References

See references/ for:

  • CONFIGURATION.md: Detailed setup guide and customization options
  • API_REFERENCE.md: Baidu API integration details and search query examples
  • TEMPLATE_EXAMPLES.md: Sample briefing outputs and markdown templates

Customization

Change Delivery Time

Edit crontab lines (format: minute hour day month weekday):

# 7:30 AM delivery
30 7 * * * ...
31 7 * * * ...

# 6:00 AM delivery  
0 6 * * * ...
1 6 * * * ...

Modify Search Queries

Edit queries in generate-briefing.py:

china_tech = search_baidu('科技新闻 人工智能 芯片 AI 华为', count=3)
intl_tech = search_baidu('NVIDIA Broadcom Apple Microsoft AI', count=3)

Add Custom Commentary Rules

See references/API_REFERENCE.md for commentary rule patterns.


Multi-Channel Delivery Configuration

Supported Channels

The skill supports any OpenClaw-configured messaging channel:

| Channel | Target Format | Example Value | Notes |
| --- | --- | --- | --- |
| QQBot | `c2c:USERID` | `c2c:9C12E02D...` | Default, for private QQ messages |
| Telegram | `@username` or chat ID | `@mychannel` or `123456789` | Bot must be added to channel |
| Discord | `guild-id/channel-id` | `1234567890/9876543210` | Bot needs permissions |
| Slack | `#channel-name` or `@user` | `#general` or `U123456` | Slack app configured |
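A small helper can normalize targets per channel. `format_target` is hypothetical and only encodes the formats listed in this README (QQBot takes a `c2c:` prefix; other channels pass the target through unchanged):

```python
def format_target(channel, target):
    """Build the per-channel delivery target string.

    Sketch under the assumption that only QQBot needs a prefix; the
    other channels use their IDs as-is.
    """
    if channel == "qqbot" and not target.startswith("c2c:"):
        return f"c2c:{target}"
    return target
```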

Configure Channel in Script

Method 1: Environment Variable

export NEWS_CHANNEL="telegram"  # qqbot, telegram, discord, slack
export NEWS_TARGET_USER="@mychannel"  # Channel-specific target

Method 2: Edit deliver-briefing.sh

Find line ~50 and modify:

--channel qqbot      # Change to your preferred channel
-t "qqbot:c2c:${TARGET_USER}"  # Update target format per channel

Channel-Specific Setup

For Telegram:

  1. Create a bot via @BotFather
  2. Add bot to channel/group
  3. Get chat ID using openclaw status
  4. Set NEWS_TARGET_USER="@channelname" or numeric ID

For Discord:

  1. Create application in Discord Developer Portal
  2. Invite bot to server with proper permissions
  3. Use format: guild-id/channel-id
  4. Test with: openclaw message send --channel discord -t "GUILD/CHANNEL" -m "Test"

For QQBot:

  • Ensure QQ Bot is configured in OpenClaw
  • Use c2c format for private messages
  • Target user ID available from chat metadata

Output Format

Generated briefing includes:

  • Date and timestamp header
  • 4 news categories with 3 articles each (customizable)
  • AI-powered Jarvis commentary for each article
  • Source attribution and read-more links
  • File size: typically 10-15KB

Delivery message includes:

  • Preview with first headlines from tech & finance sections
  • <qqfile> attachment tag for full markdown report
  • Footer with next delivery time notice
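Headline extraction for the preview can be sketched as a scan for the 🖥️/📈 section markers. `first_headlines` is an illustrative stand-in for what news-deliver-direct.py is described as doing, under the assumption that the first non-empty line after each marker is the headline:

```python
def first_headlines(markdown_text):
    """Return the first headline after the tech (🖥️) and finance (📈)
    section headers, roughly what the delivery preview contains."""
    headlines = {}
    section = None
    for line in markdown_text.splitlines():
        if line.startswith("## "):
            if "🖥️" in line:
                section = "tech"
            elif "📈" in line:
                section = "finance"
            else:
                section = None
        elif section and line.strip() and section not in headlines:
            # Strip leading list/bold markers and trailing bold markers.
            headlines[section] = line.strip().lstrip("#*- ").rstrip("*")
    return headlines
```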

Logs & Monitoring

| Log File | Location | Purpose |
| --- | --- | --- |
| News generation | `/var/log/daily-news.log` | Search results, article counts, errors |
| Delivery attempts | `/var/log/news-delivery.log` | Send status, fallback methods used |

View recent logs:

# Today's news generation
tail -50 /var/log/daily-news.log

# Latest delivery result
grep "completed\|ERROR" /var/log/news-delivery.log | tail -10

Troubleshooting

No results from Baidu API:

  • Verify BAIDU_API_KEY is set and valid (length > 20 chars)
  • Check /var/log/daily-news.log for "Found X results" messages
  • Test (load the script by path, since the hyphen in generate-briefing.py blocks a plain import): python3 -c "import importlib.util as u; s=u.spec_from_file_location('gb','generate-briefing.py'); m=u.module_from_spec(s); s.loader.exec_module(m); print(m.search_baidu('test', count=1))"
  • Fallback: If Baidu API fails, script automatically uses DuckDuckGo (look for "falling back to DuckDuckGo" in logs)

No results from any source:

  • Check network connectivity: ping -c 2 duckduckgo.com
  • Verify firewall isn't blocking outbound HTTP/HTTPS
  • Test DuckDuckGo directly: curl -s "https://html.duckduckgo.com/html/?q=test"
  • Review /var/log/daily-news.log for detailed error messages

Delivery fails:

  • Check /var/log/news-delivery.log for specific error
  • Verify OpenClaw CLI is installed: openclaw --version
  • Confirm target user/channel ID is correct in script config
  • Test manual delivery: bash deliver-briefing.sh

Wrong headlines extracted:

  • Script supports both Chinese (## 🖥️ 中国科技新闻) and English headers
  • Ensure markdown file has proper structure with section markers
  • Check log shows file size > 1KB (not empty fallback)
  • DuckDuckGo results may have longer titles - this is normal

Cron job not running:

  • Verify cron service: systemctl status cron
  • Check crontab entries: crontab -l
  • Test cron timing: wait for scheduled time or adjust to test immediately
  • Check log rotation: old logs may be archived

Check which search method is being used:

# View log file
grep "Search method used" /var/log/daily-news.log | tail -1

# Or check footer of generated news file
tail -5 ~/daily-news-$(date +%Y%m%d).md

Expected output: 📊 Search method used: baidu or 📊 Search method used: duckduckgo

Notes

  • Script auto-generates filename with current date: daily-news-YYYYMMDD.md
  • Logs stored in /var/log/daily-news.log and /var/log/news-delivery.log
  • Fallback content generated if search returns no results (graceful degradation)
  • Supports multiple messaging channels via OpenClaw CLI
  • All credentials managed via environment variables (no hardcoded secrets)

Version: 1.0
Last Updated: 2026-03-28
Maintained by: Jarvis AI Assistant
