Fiscal Data Collection & Analysis (财政数据采集分析)
v1.0.0. Collects and analyzes fiscal revenue and expenditure data from China's Ministry of Finance. Use this skill when the user mentions the following scenarios: (1) scraping fiscal data, trigger phrases: "scrape fiscal data", "collect fiscal data", "latest fiscal data", "fiscal data collection"; (2) analyzing fiscal data, trigger phrases: "analyze fiscal data", "analyze the fiscal deficit", "research fiscal revenue", "compare fiscal revenue and expenditure". Responsible for running the fiscal data collection pipeline against the Ministry of Finance official website and, on the collected...
by @cy7533
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw
Benign (high confidence)

Purpose & Capability
The name/description (collect and analyze Ministry of Finance fiscal revenue and expenditure data) match the code and runtime instructions: a crawler (requests + BeautifulSoup) that reads Ministry of Finance pages; parsers that extract cumulative values and derive monthly values; and transform/export modules that write Excel outputs. No unrelated credentials, binaries, or services are requested.
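The "derive monthly values" step mentioned above is the usual year-to-date differencing trick. The sketch below is illustrative only (the function name and numbers are hypothetical, not taken from the skill's code): each month's figure is the current cumulative total minus the previous month's cumulative total.

```python
def derive_monthly(cumulative):
    """Derive per-month values from a cumulative (year-to-date) series.

    cumulative: list of year-to-date totals, January first.
    January's monthly value equals the first cumulative figure.
    """
    monthly = []
    prev = 0
    for value in cumulative:
        monthly.append(value - prev)  # this month = YTD(t) - YTD(t-1)
        prev = value
    return monthly

# Illustrative year-to-date figures, not real MOF data
ytd = [100, 230, 390]
print(derive_monthly(ytd))  # [100, 130, 160]
```

This also explains the review's note about "missing-previous-period warnings": if a prior cumulative figure is absent, the monthly value for that period cannot be derived.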
Instruction Scope
SKILL.md tells the agent to activate a conda env and run the included run_pipeline.py, which enumerates listing pages, fetches official MOF pages, parses content, and writes outputs under an output directory. This stays within the stated purpose. Minor note: SKILL.md recommends storing artifacts under $WORKSPACE/output/artifacts, while the code writes to the provided --output-dir (default 'output') and creates per-period and summary directories; this is an operational mismatch, but not a malicious one.
Install Mechanism
There is no platform install spec in the registry entry (instruction-only), but the bundle includes code and an environment.yml / requirements.txt that use common packages (requests, beautifulsoup4, lxml, openpyxl). Creating the conda env will install those packages, which is reasonable for a scraper; the user should still run it in an isolated environment (no arbitrary remote downloads occur during install).
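The install-and-run flow described above can be sketched as follows. The env name scrapyEnv, the environment.yml spec, and the run_pipeline.py entry point are taken from the review text; exact flags in the actual bundle may differ:

```shell
# Create an isolated conda env from the bundled spec (env name from SKILL.md)
conda env create -f environment.yml -n scrapyEnv
conda activate scrapyEnv

# Run the pipeline into a dedicated directory you control
python run_pipeline.py --output-dir ./output
```

Overriding the env name with -n and pointing --output-dir at a dedicated directory keeps both the installed packages and the generated Excel files contained for review.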
Credentials
The skill requires no environment variables, secrets, or external credentials. It only performs HTTP GETs to the official MOF domain and writes local Excel files — request scope is proportional to purpose.
Persistence & Privilege
The skill is not always-included and does not request elevated privileges. It writes outputs into the configured output directory but does not modify other skills or system-wide configuration.
Assessment
This skill appears to be what it says: a web crawler + ETL for Ministry of Finance announcements. Before running it:
(1) inspect the shipped code (you have it) and run it in an isolated environment (create the conda env named scrapyEnv as instructed);
(2) run with --output-dir pointing to a dedicated directory you control (not your OS home) so you can review generated files;
(3) be aware it will make HTTP requests to https://www.mof.gov.cn and follow listing pages; if you need to restrict network access, run it in a sandbox or offline;
(4) parsing is regex-based and brittle: verify outputs and test reruns for missing-previous-period warnings;
(5) only proceed if you trust the skill source; although no secrets are requested, executing code from unknown authors carries risk, so prefer running in a disposable VM or container first.

Like a lobster shell, security has layers: review code before you run it.
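One lightweight way to act on point (3) above is a host allowlist check before each fetch. This is an illustrative guard, not part of the skill's shipped code; the allowed host is the official MOF domain named in the review:

```python
from urllib.parse import urlparse

# Restrict the crawler to the official MOF domain (illustrative allowlist)
ALLOWED_HOSTS = {"www.mof.gov.cn"}

def is_allowed(url):
    """Return True only if the URL's host is on the allowlist."""
    host = urlparse(url).hostname or ""
    return host in ALLOWED_HOSTS

print(is_allowed("https://www.mof.gov.cn/some/page"))  # True
print(is_allowed("https://example.com/payload"))       # False
```

A check like this could wrap the crawler's fetch call so that any link leading off the official domain is skipped rather than followed.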
latest: vk97cwfjqeyads1ffp1z9bbw9rd83qvvh
