Lead Hunter
v1.0.0 · Autonomous lead generation skill. Finds freshly funded companies matching your ideal customer profile, researches them, and delivers qualified leads with per...
MIT-0
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
Scanner: OpenClaw · Verdict: Benign (medium confidence)

Purpose & Capability
The name/description (autonomous lead generation) matches the SKILL.md and included files. The skill scrapes funding/news sources, filters leads by ICP, researches companies, scores them, and writes outputs to markdown/CSV/Asana as documented — the single Python scraper file and config.json are directly relevant.
Instruction Scope
Instructions explicitly read and write only skill-local files (scripts/config.json, scripts/seen.json, leads/, memory/). The agent is instructed to use platform tools (web_fetch, web_search, managed browser) with the local scraper as a fallback. Onboarding offers to create a cron job automatically — this is outside the skill directory and should be confirmed with the user before creation. The skill also references an Asana helper command (node skills/asana-pat/...), which relies on another skill or a user-supplied Asana PAT if Asana output is chosen.
Install Mechanism
No formal install spec exists in the registry, but scripts/scrape.py will auto-create a Python venv and run pip to install crawl4ai and Playwright (which will also download Chromium). This is standard for a scraper, but it means the skill will download and install third-party packages and a browser binary into the skill folder on first run — review and approve this action before executing it.
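The first-run behavior described above can be sketched in Python. The .venv path and the crawl4ai/Playwright/Chromium installs come from the scan report; the function itself is hypothetical (not the skill's actual code) and only builds the command list for review rather than running anything.

```python
# Illustrative sketch of a first-run bootstrap like the one the scan describes.
# Hypothetical helper: builds, but does not run, the install commands so they
# can be reviewed before approval. POSIX bin/ layout assumed.
import os
import sys

def bootstrap_commands(venv_dir: str = ".venv") -> list:
    """Return the commands a first run would execute, in order."""
    cmds = []
    if not os.path.isdir(venv_dir):
        # Create a venv inside the skill folder on first run.
        cmds.append([sys.executable, "-m", "venv", venv_dir])
    pip = os.path.join(venv_dir, "bin", "pip")
    # Third-party packages installed without further prompting.
    cmds.append([pip, "install", "crawl4ai", "playwright"])
    # Playwright then downloads a Chromium binary.
    cmds.append([os.path.join(venv_dir, "bin", "playwright"), "install", "chromium"])
    return cmds

# Print the command list for review before approving the run.
for cmd in bootstrap_commands():
    print(" ".join(cmd))
```

Separating "compute the commands" from "run the commands" is exactly the review step the scan recommends: you see the venv creation, the pip installs, and the browser download before anything touches the network.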
Credentials
The skill does not request environment variables or credentials in the registry metadata. Asana integration is optional and would require the user to supply an Asana PAT externally; nothing in the skill silently exfiltrates secrets. The amount and type of access requested (local file reads/writes, network for scraping) are proportional to the stated function.
Persistence & Privilege
The skill is not forced always-on and does not declare elevated platform privileges, but onboarding allows creating a cron job for scheduled runs and the scraper will write a venv and state files under the skill directory. Creating system cron entries is meaningful persistence and should be user-approved.
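For concreteness, a scheduled run of this kind is usually a single crontab line. The path and schedule below are invented for illustration; the point is that this is the sort of entry the agent should show you and have you approve before it is written.

```shell
# Hypothetical crontab entry (path and schedule are placeholders):
# run the scraper every weekday at 08:00 and append output to a log.
0 8 * * 1-5  cd "$HOME/skills/lead-hunter" && python3 scripts/scrape.py >> leads/cron.log 2>&1
```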
Assessment
This skill appears coherent with its stated purpose, but take these precautions before installing or running it:

1) Inspect scripts/scrape.py and scripts/config.json yourself (or run in a sandbox) — the scraper will create a .venv, pip-install crawl4ai and Playwright, and download Chromium.
2) Start with output type=markdown so no external service credentials are needed; only supply an Asana PAT if you trust the code and want Asana output.
3) Be cautious when allowing the agent to create a cron job — confirm scheduling actions explicitly.
4) Review and limit the source list (config.sources) to trusted sites to avoid excessive scraping or potential TOS issues (LinkedIn scraping may violate some sites' terms).
5) Run python3 scripts/scrape.py --check first to see what will be installed, and consider running the skill in an isolated environment if you have security concerns.

Like a lobster shell, security has layers — review code before you run it.
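Precaution 4 is easiest to apply in scripts/config.json before the first run. The scan does not document the config schema, so the keys below are assumptions based on its mentions of config.sources and output type=markdown; check the actual file for the real key names.

```json
{
  "sources": [
    "https://techcrunch.com",
    "https://news.crunchbase.com"
  ],
  "output": "markdown"
}
```

Starting from a short allowlist of sources you trust keeps scraping volume down and avoids sites whose terms prohibit it.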
latest: vk9786tn7a7vp4rkwepmgzws4ss83b6yg
