RSS监控

v1.0.0

RSS monitoring skill: monitors RSS/Atom feeds, detects updates, and fetches new content.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for 534422530/laosi-rss.

Prompt Preview: Install & Setup
Install the skill "RSS监控" (534422530/laosi-rss) from ClawHub.
Skill page: https://clawhub.ai/534422530/laosi-rss
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install laosi-rss

ClawHub CLI


npx clawhub@latest install laosi-rss
Security Scan
VirusTotal
Benign
OpenClaw
Benign
high confidence
Purpose & Capability
The name and description describe RSS/Atom monitoring; the only dependency is feedparser, and the provided Python sample implements feed parsing, update detection, and basic filtering, all consistent with the stated purpose.
Instruction Scope
SKILL.md instructs installing feedparser and shows explicit Python logic plus a simple curl example. The instructions do not request unrelated files or environment variables, and they do not transmit data to unexpected endpoints. The curl/grep example is simplistic and brittle but not out of scope.
Install Mechanism
No install spec beyond advising `pip install feedparser` (a standard PyPI dependency). No arbitrary downloads or archive extraction; low install risk.
Credentials
The skill requires no environment variables, credentials, or config paths. Requested access is proportional to monitoring feeds over the network.
Persistence & Privilege
Skill is not always-on and does not request elevated persistence or modify other skills/configurations. The provided sample stores last_check in-memory (no persistent storage), which is a functional detail rather than a privilege escalation.
Assessment
This skill appears consistent with an RSS/Atom monitor. Before installing, note:

  1. It fetches remote feed URLs over the network; only add feeds you trust, and consider the privacy implications.
  2. The provided example keeps update state in memory (no persistence); if you need historical state, implement safe storage.
  3. pip packages run code on install; installing feedparser from PyPI is common, but you may prefer to review or pin the dependency version.
  4. The curl/grep example is brittle and may not correctly parse XML/HTML; prefer the Python feedparser code for production.

If you want stronger isolation, run the skill or its dependencies in a sandboxed environment.

Like a lobster shell, security has layers — review code before you run it.

Tags: feed · latest · monitor · rss
32 downloads
0 stars
1 version
Updated 9h ago
v1.0.0
MIT-0

RSS Monitor - RSS监控

Trigger phrases: RSS监控 / 订阅更新 / Feed监控

Install

pip install feedparser

Features

  • Parse RSS/Atom feeds
  • Detect new content
  • Filter by category
  • History tracking
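The feature list mentions category filtering, but the sample class below does not implement it. A minimal sketch of keyword-based filtering over entry dicts (shaped like the output of the sample's get_entries) might look like this; filter_entries is a hypothetical helper, not part of the skill:

```python
# Hypothetical helper (not part of the skill): keyword filtering over
# entry dicts shaped like the sample's get_entries() output.
def filter_entries(entries, keywords):
    """Keep entries whose title or summary mentions any keyword (case-insensitive)."""
    lowered = [k.lower() for k in keywords]
    return [
        e for e in entries
        if any(k in (e.get('title', '') + ' ' + e.get('summary', '')).lower()
               for k in lowered)
    ]

entries = [
    {'title': 'Rust 1.80 released', 'summary': 'Language news'},
    {'title': 'Cooking tips', 'summary': 'Food blog'},
]
print(filter_entries(entries, ['rust']))  # keeps only the Rust entry
```

Matching on raw substrings is crude; for real category filtering you could instead check feedparser's per-entry tags field when the feed provides one.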

Python functions

import feedparser
from datetime import datetime

class RSSMonitor:
    def __init__(self):
        self.feeds = {}        # name -> feed URL
        self.last_check = {}   # name -> newest entry time seen (in-memory only)
    
    def add_feed(self, name: str, url: str):
        self.feeds[name] = url
    
    def check_updates(self) -> list:
        """Return entries newer than the last run for every registered feed."""
        updates = []
        for name, url in self.feeds.items():
            feed = feedparser.parse(url)
            newest = self.last_check.get(name)
            
            for entry in feed.entries[:5]:
                # Some feeds omit publication dates; skip those entries.
                parsed = entry.get('published_parsed')
                if parsed is None:
                    continue
                entry_time = datetime(*parsed[:6])
                
                if newest is None or entry_time > newest:
                    updates.append({
                        'feed': name,
                        'title': entry.title,
                        'link': entry.link,
                        'published': entry.get('published', 'Unknown'),
                    })
            
            # Track the newest entry time rather than the local clock, so
            # feeds whose timestamps lag wall-clock time are not missed.
            times = [datetime(*e.published_parsed[:6])
                     for e in feed.entries if e.get('published_parsed')]
            if times:
                self.last_check[name] = max(times)
        
        return updates
    
    def get_entries(self, url: str, limit: int = 10):
        feed = feedparser.parse(url)
        return [{
            'title': e.title,
            'link': e.link,
            'summary': e.get('summary', '')[:200],
        } for e in feed.entries[:limit]]

Command line

# Parse RSS (quick check only; grep does not understand XML)
curl -s "https://example.com/feed.xml" | grep -o '<title>.*</title>'

Use cases

  1. Monitor tech blog updates
  2. Track news sources
  3. Follow podcast episodes
  4. Monitor social media activity
