AI News Aggregator

v1.2.8

Fetches AI & tech news (default) or any custom topic (crypto, geopolitics, etc.) from RSS feeds, Tavily search, Twitter/X, and YouTube. Writes an English editorial digest and posts it to Discord.

by Scott Lai (@scottll)
Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description match what the package does: fetch RSS/Tavily/Twitter/YouTube sources, call one AI provider (OpenAI/DeepSeek/Anthropic), and post a formatted digest to a Discord webhook. The required primary credential (OPENAI_API_KEY by default) and DISCORD_WEBHOOK_URL are appropriate for that purpose.
Instruction Scope
SKILL.md and the script instruct the agent to run the bundled Python script (via 'uv run') which fetches remote feeds/APIs and posts to Discord. That is within scope. Minor issues: SKILL.md and code reference a DISCORD_WEBHOOK_URL_TRENDING env var (the script falls back to DISCORD_WEBHOOK_URL), and SKILL.md warns not to 'search the web manually' — the script itself performs HTTP requests to many endpoints (expected).
Install Mechanism
No install spec is provided and the skill is instruction-only with a bundled Python script. The script leverages 'uv' to run and install listed Python deps (PEP 723 header). Requiring 'uv' is reasonable here; no remote arbitrary download/install URLs or extract steps are present.
Credentials
Requested envs are proportional: DISCORD_WEBHOOK_URL (required) and OPENAI_API_KEY (primary) make sense. Optional keys (DEEPSEEK, ANTHROPIC, TAVILY, TWITTERAPI_IO, YOUTUBE) are reasonable for optional features. Minor inconsistencies: OPENAI_API_KEY appears both as primary and again under optionalEnv in SKILL.md, and DISCORD_WEBHOOK_URL_TRENDING is read by the script but not declared in requires.env.
Persistence & Privilege
always:false (default) and there is no install-time modification of other skills or global agent config. The skill does network I/O only when run; it does not request elevated or persistent platform privileges.
Assessment
This skill appears to do what it says: collect news, summarise via one chosen AI provider, and post to a Discord webhook. Before installing or running it:

1. Be prepared to provide a Discord webhook URL and (by default) your OpenAI API key — these are required for normal operation. Only supply optional service keys (Tavily, Twitter, YouTube, DeepSeek, Anthropic) if you need those sources.
2. The script will make outbound requests to AI provider endpoints and the listed news APIs, then POST to your Discord webhook — verify the webhook points to a channel you control (avoid using webhooks that forward to broad or audit-sensitive channels).
3. The skill uses 'uv' to run and install Python dependencies; follow the project's docs to install 'uv' rather than piping remote install scripts.
4. Minor documentation inconsistencies exist (duplicate OPENAI listing; DISCORD_WEBHOOK_URL_TRENDING used but not declared) — if these matter for your deployment, inspect or run the script in dry-run mode first.
5. If you do not trust the external AI endpoints or the Discord target, do not provide credentials or a webhook.

Overall the package is internally consistent with low surprise, but always review or try a dry run before enabling in production.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

🦞 Clawdis
OS: Linux · macOS · Windows
Arch: Any
Bin: uv
Env: DISCORD_WEBHOOK_URL
Primary env: OPENAI_API_KEY
Tags: ai · chinese · deepseek · discord · latest · news · rss · youtube
441 downloads · 2 stars · 13 versions · Updated 1mo ago
v1.2.8 · MIT-0 · Linux, macOS, Windows

🦞 AI News Aggregator

Collects news on any topic, writes an English editorial digest using your choice of AI provider, and posts it to Discord.

Default (AI topic): TechCrunch · The Verge · NYT Tech (RSS) + curated AI YouTube channels
Custom topics: Tavily news search + YouTube topic search (no Shorts, sorted by views)
AI providers: OpenAI (default) · DeepSeek · Anthropic Claude — switchable per request


Network Endpoints

| Endpoint | Purpose | Condition |
| --- | --- | --- |
| https://api.openai.com/v1/chat/completions | AI editorial summarisation | Only if provider=openai (default) |
| https://api.deepseek.com/chat/completions | AI editorial summarisation | Only if provider=deepseek |
| https://api.anthropic.com/v1/messages | AI editorial summarisation | Only if provider=claude |
| https://discord.com/api/webhooks/... | Post digest to Discord | Always (required) |
| https://techcrunch.com/.../feed/ | RSS news (AI topic) | Default AI topic only |
| https://www.theverge.com/rss/... | RSS news (AI topic) | Default AI topic only |
| https://www.nytimes.com/svc/collections/... | RSS news (AI topic) | Default AI topic only |
| https://api.tavily.com/search | Custom topic news search | Only if TAVILY_API_KEY set |
| https://api.twitterapi.io/twitter/tweet/advanced_search | Twitter search | Only if TWITTERAPI_IO_KEY set |
| https://www.googleapis.com/youtube/v3/... | YouTube search | Only if YOUTUBE_API_KEY set |

Exactly one AI endpoint is contacted per run, determined by the active provider. The default provider is OpenAI (OPENAI_API_KEY required). Switch providers with --provider deepseek or --provider claude.
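The Discord endpoint can be exercised on its own before a full run. Discord webhooks accept a plain JSON POST with a content field, so a quick manual check looks like the sketch below (it sends a visible test message to the channel; the guard avoids a failed request when the variable is unset):

```shell
# Send a minimal test message to the configured Discord webhook.
payload='{"content": "webhook test"}'
if [ -n "$DISCORD_WEBHOOK_URL" ]; then
  curl -sS -X POST "$DISCORD_WEBHOOK_URL" \
    -H "Content-Type: application/json" \
    -d "$payload"
else
  echo "DISCORD_WEBHOOK_URL not set; skipping webhook test"
fi
```

The script's own --test-discord flag (Step 5) does the equivalent check for you.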


Usage Examples

  • "Get today's AI news"
  • "Collect news about crypto"
  • "Last week's news about climate change"
  • "What's trending in AI today?"
  • "Get crypto news from the last 3 days using OpenAI"
  • "Show me recent Bitcoin YouTube videos"
  • "Summarise WWIII news with Claude"
  • "AI news using GPT-4o"
  • "AI news dry run" (preview without posting to Discord)
  • "Test my Discord webhook"

API Keys

| Key | Required | Where to get it |
| --- | --- | --- |
| DISCORD_WEBHOOK_URL | ✅ Always | Discord → Channel Settings → Integrations → Webhooks → Copy URL |
| OPENAI_API_KEY | If using OpenAI (default) | platform.openai.com/api-keys |
| DEEPSEEK_API_KEY | If using DeepSeek | platform.deepseek.com/api_keys |
| ANTHROPIC_API_KEY | If using Claude | console.anthropic.com → API Keys |
| TAVILY_API_KEY | For custom topics | app.tavily.com |
| TWITTERAPI_IO_KEY | Optional | twitterapi.io |
| YOUTUBE_API_KEY | Optional | console.cloud.google.com → YouTube Data API v3 |

AI Providers & Models

| Provider | --provider value | Default model | Best for |
| --- | --- | --- | --- |
| OpenAI | openai (default) | gpt-4o-mini | Quality, reliability |
| DeepSeek | deepseek | deepseek-chat | Cost-effective, fast |
| Claude | claude | claude-3-5-haiku-20241022 | Nuanced writing |

Override per request using the --provider flag. Set a permanent non-default with openclaw config set env.AI_PROVIDER '"deepseek"'. Override the model with --model (e.g. --model gpt-4o or --model claude-3-5-sonnet-20241022).


Implementation

IMPORTANT: Always run news_aggregator.py using the steps below. Do NOT search the web manually or improvise a response — the script handles all fetching, summarisation, and Discord posting.

Step 1 — Locate the script

The script is bundled with this skill. Find it:

SKILL_DIR=$(ls -d ~/.openclaw/skills/ai-news-aggregator-sl 2>/dev/null || ls -d ~/.openclaw/skills/news-aggregator 2>/dev/null)
SCRIPT="$SKILL_DIR/news_aggregator.py"
echo "Script: $SCRIPT"
ls "$SCRIPT"

Step 2 — Check uv is available

which uv && uv --version || echo "uv not found"

If uv is not found, ask the user to install it from their system package manager or from https://docs.astral.sh/uv/getting-started/installation/. Do not run a curl-pipe-sh command on the user's behalf.
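A combined check-and-suggest sketch, for convenience (the package names are assumptions for common package managers; consult the uv docs linked above if unsure):

```shell
# Check for uv and, if missing, print install suggestions instead of
# running any remote script.
if command -v uv >/dev/null 2>&1; then
  echo "uv found: $(uv --version)"
else
  echo "uv missing. Try: brew install uv (macOS), winget install astral-sh.uv (Windows),"
  echo "or pipx install uv (any platform with pipx)."
fi
```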

Step 3 — API keys

Env vars are passed automatically by OpenClaw from its config. No .env file is needed.

Verify the required keys are set (without revealing values):

[[ -n "$OPENAI_API_KEY" ]]      && echo "OPENAI_API_KEY: set"      || echo "OPENAI_API_KEY: MISSING (required for default provider)"
[[ -n "$DISCORD_WEBHOOK_URL" ]] && echo "DISCORD_WEBHOOK_URL: set" || echo "DISCORD_WEBHOOK_URL: MISSING"

If any are missing, ask the user to register them:

openclaw config set env.OPENAI_API_KEY '<key>'
openclaw config set env.DISCORD_WEBHOOK_URL '<url>'
# Optional alternatives:
openclaw config set env.DEEPSEEK_API_KEY '<key>'
openclaw config set env.ANTHROPIC_API_KEY '<key>'
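The optional keys can be checked the same way without revealing values. A loop sketch (assumes bash, since it relies on ${!name} indirect expansion):

```shell
# Report which optional keys are set; unset keys simply disable their source.
for key in DEEPSEEK_API_KEY ANTHROPIC_API_KEY TAVILY_API_KEY TWITTERAPI_IO_KEY YOUTUBE_API_KEY; do
  if [ -n "${!key}" ]; then
    echo "$key: set"
  else
    echo "$key: not set (related source/provider will be skipped)"
  fi
done
```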

Step 4 — Parse the request

Extract topic, days, and provider from what the user said:

For AI provider:

| User said | --provider | --model |
| --- | --- | --- |
| "use OpenAI" / "with GPT" / "using ChatGPT" / nothing specified | (omit — default) | (omit) |
| "use Claude" / "with Anthropic" | --provider claude | (omit) |
| "use DeepSeek" | --provider deepseek | (omit) |
| "use GPT-4o" / "with gpt-4o" | --provider openai | --model gpt-4o |
| "use claude sonnet" | --provider claude | --model claude-3-5-sonnet-20241022 |
| "use deepseek reasoner" | --provider deepseek | --model deepseek-reasoner |

Extract topic and days from what the user said:

| User said | --topic | --days |
| --- | --- | --- |
| "AI news" / "tech news" / nothing specific | (omit — default AI) | 1 |
| "crypto news" | --topic "crypto" | 1 |
| "news about climate change" | --topic "climate change" | 1 |
| "last week's crypto news" | --topic "crypto" | 7 |
| "last 3 days of Bitcoin news" | --topic "Bitcoin" | 3 |
| "yesterday's AI news" | (omit topic) | 1 |
| "this week in AI" | (omit topic) | 7 |

For report type:

| User said | flag to add |
| --- | --- |
| "news" / "articles" / "digest" | --report news |
| "trending" / "Twitter" / "YouTube" | --report trending |
| "dry run" / "preview" / "don't post" | --dry-run |
| "test Discord" / "test webhook" | --test-discord |
| anything else | (omit — runs all) |
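The parsing above can be sketched as flag assembly in bash. The variable values here are example inputs for illustration, not part of the script's interface:

```shell
# Assemble CLI flags from a parsed request ("last 3 days of crypto news with Claude").
topic="crypto"; days=3; provider="claude"; report=""
args=()
if [ -n "$topic" ];    then args+=(--topic "$topic"); fi
if [ "$days" -gt 1 ];  then args+=(--days "$days"); fi
if [ -n "$provider" ]; then args+=(--provider "$provider"); fi
if [ -n "$report" ];   then args+=(--report "$report"); fi
echo "would run: uv run \"\$SCRIPT\" ${args[*]}"
```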

Step 5 — Run with uv

uv run automatically installs all dependencies from the script's inline metadata — no venv setup needed.

uv run "$SCRIPT" [--topic "TOPIC"] [--days N] [--report TYPE] [--provider PROVIDER] [--model MODEL] [--dry-run]

Examples:

# AI news today — OpenAI (default)
uv run "$SCRIPT"

# Crypto news using OpenAI
uv run "$SCRIPT" --topic "crypto" --provider openai

# Last week's climate news using Claude
uv run "$SCRIPT" --topic "climate change" --days 7 --provider claude

# Use a specific model
uv run "$SCRIPT" --topic "Bitcoin" --provider openai --model gpt-4o

# Trending AI on Twitter and YouTube
uv run "$SCRIPT" --report trending

# Preview without posting to Discord
uv run "$SCRIPT" --topic "Bitcoin" --dry-run

# Test webhook connection
uv run "$SCRIPT" --test-discord
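For reference, uv run reads dependencies from an inline PEP 723 metadata header at the top of the script. A minimal header looks like this (the dependency list shown is illustrative, not the script's actual list):

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests",
#     "feedparser",
# ]
# ///
```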

Step 6 — Report back

Tell the user what was posted to Discord, how many items were found per source, and note any skipped sources (e.g. "YouTube skipped — YOUTUBE_API_KEY not set").
