Bioinfo Daily

v1.0.1

A daily-report generator for bioinformatics and oncology research progress. It uses the PubMed API to automatically search the previous day's literature in CNS and Nature Index journals, filters high-impact studies in bioinformatics, tumor immunology, single-cell sequencing, spatial transcriptomics, and related fields, and generates a daily report with Chinese-language highlights. Use cases: (1) set up a scheduled task to fetch research progress every day and send it to Feishu; (2) manual queries...

License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The skill implements a PubMed-based daily-report generator and only requests capabilities that make sense for that purpose (calling PubMed E-utilities, filtering high-impact journals, optional Feishu upload). However, the registry metadata claims no required environment variables, while SKILL.md and the scripts clearly require NCBI_EMAIL and NCBI_API_KEY. This looks like a metadata oversight rather than anything malicious.
Instruction Scope
Runtime instructions and scripts stay within the declared scope: they read a local .env in the skill directory (or environment/openclaw config), call PubMed APIs (ncbi.nlm.nih.gov), optionally use OpenClaw's web_search tool, generate a /tmp text/markdown report, and optionally prepare content for Feishu. There are no instructions to read unrelated system secrets or transmit data to unknown endpoints.
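To make the network footprint concrete, a call to PubMed's E-utilities boils down to a single GET against eutils.ncbi.nlm.nih.gov. The sketch below only builds such a URL (it performs no request); the search term, email, and API key are placeholders, not values taken from the skill's actual scripts.

```python
from urllib.parse import urlencode

# Base endpoint for PubMed searches via NCBI E-utilities.
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_esearch_url(term: str, email: str, api_key: str, retmax: int = 50) -> str:
    """Build a PubMed esearch URL; the caller performs the HTTP GET."""
    params = {
        "db": "pubmed",
        "term": term,
        "retmode": "json",
        "retmax": retmax,
        "email": email,      # NCBI asks clients to identify themselves
        "api_key": api_key,  # raises the per-IP rate limit (3 -> 10 req/s)
    }
    return f"{BASE}?{urlencode(params)}"

# Placeholder query illustrating the shape of a journal+date-filtered search.
url = build_esearch_url(
    'single-cell[Title/Abstract] AND "Nature"[Journal]',
    email="you@example.org",
    api_key="YOUR_NCBI_API_KEY",
)
print(url)
```

Everything the skill sends, then, is visible in one query string, which makes it easy to audit from a proxy or from the logs of an isolated run.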
Install Mechanism
This is effectively instruction-plus-scripts (no install spec that downloads external archives). All code is included in the skill bundle; there are no network downloads or non-standard install steps in the package files provided.
Credentials
The scripts legitimately require NCBI_EMAIL and NCBI_API_KEY (and optionally Feishu credentials for auto-upload). Those environment variables are appropriate for the task. The concern is that the registry-level 'Required env vars' field is empty, so the platform metadata under-reports required credentials — the omission should be corrected before automated installs.
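For reference, a minimal .env in the skill directory would look like the sketch below. The NCBI variable names come from the review above; the values are placeholders, and the exact names of any Feishu variables should be taken from the skill's own SKILL.md.

```shell
# .env in the skill directory -- values below are placeholders
NCBI_EMAIL=you@example.org
NCBI_API_KEY=0123456789abcdef0123456789abcdef0123
# Feishu credentials (exact variable names per SKILL.md) only if you
# enable automated upload; omit them for manual-upload mode.
```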
Persistence & Privilege
The skill does not request always:true, does not modify other skills, and its runtime artifacts are limited to reading local config/.env and writing report files under /tmp and within the skill directory. Cron wrappers are provided but nothing forces permanent or elevated privileges.
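If you do opt into scheduled runs, the provided cron wrappers amount to an ordinary crontab entry under your own user; the line below is an illustrative sketch (the path and script name are assumptions, not the skill's exact command).

```shell
# Run the daily report at 07:00 as the current user; path/script are assumed
0 7 * * * cd /path/to/skills/bioinfo-daily && ./run_daily.sh >> /tmp/bioinfo_daily.log 2>&1
```

Nothing here requires root or a system-level unit, which is consistent with the "no elevated privileges" finding above.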
Assessment
This skill appears to do what it says: fetch PubMed results, filter high-impact journals, and generate a Chinese daily report. Before installing:
1. Be aware you must provide NCBI_EMAIL and NCBI_API_KEY (the skill reads them from the environment, ~/.openclaw/openclaw.json, or a .env in the skill directory); the registry listing currently omits this.
2. If you enable automated upload, you will need to provide Feishu credentials; otherwise the script only writes files for manual upload.
3. search_bioinfo.py invokes 'openclaw web_search' (which uses your configured search provider); confirm that web_search points to a provider you trust.
4. Run the scripts in an isolated environment if you are concerned about network activity.
5. Because the source is 'unknown' and the package metadata points to a GitHub repo, consider reviewing or running the code locally before granting it cron/automatic execution.

Like a lobster shell, security has layers — review code before you run it.

latest: vk978ckxpmmwyfn68hqyfg3v4jd82qnnw

