Car Scraper
v1.0.0 · Vehicle-data scraping skill with anti-bot countermeasures. Collects used- and new-car data from 大搜车 (Dasouche), 懂车帝 (Dongchedi), and 汽车之家 (Autohome), and outputs an OpenClaw-compatible format. USE FOR: scraping vehicle listings, vehicle details, price trends, and dealer information. Supported anti-scraping countermeasures: UA rotation, rate-limit backoff, cookie management, fingerprint spoofing.
by Strue (@jackiezhao-eng)
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan (OpenClaw)
Verdict: Benign (high confidence)

Purpose & Capability
The name and description (car-data scraping from 大搜车/懂车帝/汽车之家) match the provided Python modules and SKILL.md. The scrapers, data model, and OpenClaw export code correspond to the stated capabilities.
Instruction Scope
SKILL.md instructs the agent to import scrapers and export functions only (no broad file reads). The code will perform network requests to target sites, manage cookies, rotate UAs, use rate limiting and optionally proxies. One side-effect: config.py creates output directories at import time (os.makedirs on package path). The anti-detection logic explicitly aims to evade site protections — functionally coherent but worth noting for policy/ethical review.
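The import-time side effect noted above is worth illustrating. The sketch below is a reconstruction of the pattern the scan describes, not the skill's actual config.py; the directory names follow the scan's findings, and the base path is the working directory here rather than the package path.

```python
import os

# Illustrative config.py-style module-level code: output directories are
# created as a side effect of `import`, not inside any function.
BASE_DIR = os.getcwd()  # the real skill reportedly uses its package path
OUTPUT_DIR = os.path.join(BASE_DIR, "output")
OPENCLAW_DIR = os.path.join(OUTPUT_DIR, "openclaw")

# Merely importing the package touches the filesystem.
os.makedirs(OUTPUT_DIR, exist_ok=True)
os.makedirs(OPENCLAW_DIR, exist_ok=True)
```

A reviewer can catch this pattern by grepping for `os.makedirs` or `open(` at module top level before importing the package.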
Install Mechanism
No install spec; this is instruction + code files only. Dependencies appear to be standard Python libs (requests, bs4) referenced in requirements.txt — no unknown remote download or extract steps detected.
Credentials
The skill does not request environment variables, credentials, or external config paths. It contains an (empty) proxy list and configurable headers in config.py; no secrets are required by default. If you add proxies with credentials, that would raise additional risk.
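The proxy-credential risk mentioned here is easy to check mechanically. A minimal sketch, assuming config.py exposes a plain list of proxy URLs; the names below are illustrative, not the skill's actual identifiers:

```python
from urllib.parse import urlparse

# Illustrative config.py shape per the scan: an empty proxy list and
# configurable default headers, no secrets required by default.
PROXIES: list[str] = []  # "http://user:pass@host:8080" would embed credentials
DEFAULT_HEADERS = {
    "Accept-Language": "zh-CN,zh;q=0.9",
    "User-Agent": "Mozilla/5.0",
}

def has_credentialed_proxy(proxies: list[str]) -> bool:
    """Return True if any proxy URL embeds user:pass credentials."""
    return any(urlparse(p).username is not None for p in proxies)
```

For example, `has_credentialed_proxy(["http://user:pass@host:8080"])` returns True, which is a cue to move those credentials out of the checked-in config.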
Persistence & Privilege
The manifest sets always:false, and no code attempts to modify other skills or the global agent configuration. The only persistent filesystem actions are creating output directories within the package path and writing scraped output (OpenClaw exports), which is expected for this purpose.
Assessment
This skill appears to be what it claims: a multi-site car-data scraper with anti-detection features. Before installing or running it:

1. Be aware it will perform outbound HTTP(S) requests to the target sites and may be used to circumvent anti-scraping measures (UA rotation, header fingerprinting, cookie handling, proxy support). Ensure you have the legal right to collect this data and comply with the target sites' terms of service and local laws.
2. Run it in a controlled environment (sandbox or isolated VM) if you're unsure, since the skill writes output files (config.py creates output/ and openclaw/ directories inside the package).
3. Inspect and sanitize config.py before use: remove or review any proxy entries (they may contain credentials) and set acceptable rate limits to avoid causing harm.
4. Monitor network activity and logs during initial runs.
5. If you need to avoid evasion capabilities, remove or disable the anti_detect features (UA rotation, fingerprinting) so behavior is conservative and transparent.

Like a lobster shell, security has layers: review code before you run it.
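If you opt for conservative behavior, a fixed minimum delay between requests is the simplest replacement for adaptive backoff and evasion tricks. A hedged sketch; the class name and interface are hypothetical, not the skill's API:

```python
import time

class ConservativeLimiter:
    """Enforce a fixed minimum delay between requests (no evasion tricks)."""

    def __init__(self, min_interval_s: float = 2.0):
        self.min_interval_s = min_interval_s
        self._last = 0.0  # monotonic timestamp of the previous request

    def wait(self) -> float:
        """Sleep until min_interval_s has elapsed; return seconds slept."""
        now = time.monotonic()
        slept = max(0.0, self.min_interval_s - (now - self._last))
        if slept:
            time.sleep(slept)
        self._last = time.monotonic()
        return slept
```

Calling `wait()` before every HTTP request caps the request rate at one per `min_interval_s`, which is easy to audit in network logs.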
