Skill flagged — suspicious patterns detected
ClawHub Security flagged this skill as suspicious. Review the scan results before using.
Tender Analysis System
v1.0.0. Automatically scrapes tender announcements across industries, applies intelligent analysis, and generates multi-dimensional structured reports on bid risk and payment terms.
by @ylbwjf
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
OpenClaw
Verdict: Suspicious (medium confidence)
Purpose & Capability
SKILL.md and README describe automated hourly scraping of many external sites and LLM-based deep analysis. The repository contains a reporter and DB helper but no implemented scraping/crawl logic or push implementation; requests is imported but not used for crawling. That means the declared purpose (automated scraping + analysis) is not implemented by the code as-is — a notable mismatch.
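The "imported but unused" observation is easy to verify mechanically. Below is a minimal sketch using the standard-library `ast` module; the sample source string is illustrative, not the skill's actual code:

```python
import ast

def unused_imports(source: str) -> list[str]:
    """Return top-level imported names that are never referenced.

    Imported-but-unused modules (like `requests` here) are one quick
    signal that a declared capability is not actually implemented.
    """
    tree = ast.parse(source)
    imported = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            imported.update((a.asname or a.name).split(".")[0] for a in node.names)
        elif isinstance(node, ast.ImportFrom):
            imported.update(a.asname or a.name for a in node.names)
    used = {n.id for n in ast.walk(tree) if isinstance(n, ast.Name)}
    return sorted(imported - used)

sample = "import requests\nimport yaml\nprint(yaml.safe_load('a: 1'))\n"
print(unused_imports(sample))  # → ['requests']
```

Because `ast.parse` only parses the text, this works even when the audited module's dependencies are not installed.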
Instruction Scope
Runtime instructions and scripts (run.sh, cron_setup.sh, SKILL.md) limit actions to installing dependencies, running the analyzer, and scheduling cron jobs. The SKILL.md asks users to configure API keys/webhooks in config.yaml; there are no instructions to read unrelated system files. However, SKILL.md expects LLM integration and push channels that would require network access and credentials provided in config.yaml.
Install Mechanism
There is no packaged install step; run.sh performs a pip install of small, common packages (pyyaml, requests). No downloads from external arbitrary URLs or archive extraction are present. This is low-to-moderate install risk.
Credentials
The project does not declare required environment variables, yet config.yaml contains fields for LLM api_key, Feishu webhook, and SMTP credentials (including a pre-filled username). These credentials are expected to be stored in plaintext in config.yaml rather than environment variables, increasing risk of accidental leakage. No unrelated high-privilege credentials are requested, but plaintext credential storage is disproportionate and risky.
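One way to mitigate the plaintext-credential risk is to resolve each secret from the environment first and fall back to config.yaml only when nothing is set. A hedged sketch; the `TENDER_*` variable names and config keys are assumptions, not the skill's actual schema:

```python
import os

# Hypothetical mapping from config.yaml keys to environment variables.
ENV_MAP = {
    "llm_api_key": "TENDER_LLM_API_KEY",
    "smtp_password": "TENDER_SMTP_PASSWORD",
    "feishu_webhook": "TENDER_FEISHU_WEBHOOK",
}

def get_secret(name: str, config: dict) -> str:
    """Prefer an environment variable over a plaintext config value."""
    value = os.environ.get(ENV_MAP.get(name, "")) or config.get(name)
    if not value:
        raise KeyError(f"secret {name!r} is not configured")
    return value

os.environ["TENDER_LLM_API_KEY"] = "sk-demo"  # demo value only
print(get_secret("llm_api_key", {}))          # → sk-demo
print(get_secret("smtp_password", {"smtp_password": "from-config"}))  # → from-config
```

With this pattern, config.yaml can ship with the secret fields empty, so nothing sensitive lands in version control or on shared disks.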
Persistence & Privilege
cron_setup.sh will add crontab entries to run the script hourly and every 2 hours for reports. Scheduling itself is reasonable for a monitoring tool, but it modifies the user's crontab (persistence) and will run periodically if the user executes that script — users should be aware and review before running.
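Before letting cron_setup.sh touch your crontab, it is worth checking what cadence each proposed entry actually encodes. A small sketch; the script paths are placeholders matching the schedule the scan describes, not the script's verified contents:

```python
def schedule_of(entry: str) -> str:
    """Classify the minute/hour fields of a crontab line."""
    minute, hour = entry.split()[:2]
    if minute == "0" and hour == "*":
        return "hourly"
    if minute == "0" and hour.startswith("*/"):
        return f"every {hour[2:]} hours"
    return "other"

# Placeholder entries mirroring the cadence described above.
proposed = [
    "0 * * * * /path/to/run.sh",
    "0 */2 * * * /path/to/report.sh",
]
for entry in proposed:
    print(schedule_of(entry))  # → hourly, then: every 2 hours
```

Anything classified as "other" deserves a closer look before you let a setup script install it persistently.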
What to consider before installing
This package looks like a template for a tender-scraping and analysis system, but the actual crawling and push logic are missing. Before installing or running:
1. Don't place secrets (LLM API keys, SMTP passwords, webhook URLs) in config.yaml on shared machines; prefer environment variables or a secure secret store.
2. Inspect and, if needed, implement or review any scraping code to ensure it follows robots.txt and legal constraints; the current code does not perform scraping.
3. Review cron_setup.sh before running it; it will modify your crontab and create persistent scheduled jobs.
4. If you plan to enable push notifications (Feishu/SMTP), verify webhook URLs and SMTP credentials and test in an isolated environment.
5. If you need the described scraping features, request or implement the missing network/crawl modules and re-audit them for unexpected endpoints or data exfiltration.
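For the robots.txt part of that checklist, the standard library already covers the check; any crawl module added later can gate requests like this (an offline sketch with inline rules, since fetching a live robots.txt is site-specific):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Feed rules directly instead of fetching; real code would call
# rp.set_url("https://example.com/robots.txt") and rp.read().
rp.parse(["User-agent: *", "Disallow: /private/"])

print(rp.can_fetch("*", "https://example.com/tenders"))    # → True
print(rp.can_fetch("*", "https://example.com/private/x"))  # → False
```

Calling `can_fetch` before every request keeps the crawler within each site's published policy; legal constraints still need separate review.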
