Data Enricher

v1.0.0

Enrich leads by finding verified emails via contact pages, Instagram, Hunter.io, and patterns, then format data for Notion with deduplication and batching.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for visualdeptcreative/data-enricher.

Prompt preview: Install & Setup
Install the skill "Data Enricher" (visualdeptcreative/data-enricher) from ClawHub.
Skill page: https://clawhub.ai/visualdeptcreative/data-enricher
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install data-enricher

ClawHub CLI


npx clawhub@latest install data-enricher
Security Scan

VirusTotal: Benign · View report →
OpenClaw: Suspicious (high confidence)
⚠ Purpose & Capability
The SKILL.md describes using the Hunter.io API and syncing/checking Notion, but the skill metadata declares no required environment variables or credentials. That is incoherent: Hunter.io requires an API key and Notion requires a token. The SKILL.md also lists specific models to use (ollama/llama3.2 and 'haiku'), which the metadata does not explain or constrain.
⚠ Instruction Scope
Runtime instructions tell the agent to fetch website contact pages and Instagram bios, call Hunter.io, perform email-guessing heuristics, check Notion for duplicates, and save batches to workspace/leads-enriched-YYYY-MM-DD.json. Those steps involve network access, scraping, and file writes — none of which are declared or constrained. The instructions also reference {HUNTER_API_KEY} and checking Notion but do not specify how credentials will be provided or how to authenticate.
Install Mechanism
No install specification and no code files are present; the skill is instruction-only. That reduces the risk of arbitrary code being downloaded or written to disk. (Instruction-only skills still require network and credential access during runtime.)
⚠ Credentials
The SKILL.md implicitly requires at least a HUNTER_API_KEY and some Notion API token or credentials to perform domain lookups and Notion deduplication/sync, but the metadata lists no required env vars. The absence of declared credentials is disproportionate to the described functionality and is a material omission.
Persistence & Privilege
The skill's always flag is false and the skill is user-invocable (normal). The instructions ask the agent to write JSON files into a workspace path and to check/sync Notion; these are ordinary operations, but they mean the agent will read/write local workspace files and use external APIs. There is no request to modify other skills or system settings.
What to consider before installing
Do not install or enable this skill until the author corrects its metadata. Specifically:

  1. Require and document HUNTER_API_KEY and the Notion credential (NOTION_TOKEN or similar) in requires.env so you know what secrets the skill needs.
  2. Confirm where workspace/leads-enriched-YYYY-MM-DD.json will be written and that you approve that file location and retention.
  3. Verify the legal/ToS/privacy implications of scraping websites and Instagram (and ensure you have permission to collect and store personal emails).
  4. Ask the author to clarify the listed models ('haiku' for API calls is unusual) and to provide explicit instructions for authentication and rate limiting.
  5. Only provide API keys with least privilege, and consider using short-lived tokens or a dedicated account.

If the author cannot or will not provide these clarifications, treat the skill as unsafe to run: it can collect and store contact data while not declaring the credentials it needs.

Like a lobster shell, security has layers — review code before you run it.

latest: vk9728tvsg688rryvb0bzad2msx80kqwg
1.7k downloads · 1 star · 1 version
Updated 1mo ago · v1.0.0 · MIT-0

SKILL.md - Data Enricher

Purpose

Enrich leads with email addresses and format data for Notion.

Model to Use

  • ollama/llama3.2:8b (FREE) for data formatting
  • haiku for Hunter.io API calls

Rate Limits

  • Max 10 Hunter.io lookups per session (API limit)
  • 5 seconds between API calls
  • Batch similar domains together
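Under these limits, the throttling and domain batching could be sketched as follows. This is a minimal illustration, not part of the skill: `lookup` is a placeholder for the actual Hunter.io call, and the constants mirror the rules above.

```python
import time
from collections import defaultdict

MAX_LOOKUPS = 10   # max Hunter.io lookups per session (API limit)
DELAY_SECONDS = 5  # pause between API calls

def group_by_domain(leads):
    """Batch leads that share a domain so one lookup covers all of them."""
    groups = defaultdict(list)
    for lead in leads:
        groups[lead["domain_key"]].append(lead)
    return groups

def run_lookups(domains, lookup, delay=DELAY_SECONDS, max_lookups=MAX_LOOKUPS):
    """Call lookup(domain) for each domain, honoring the session cap and delay."""
    results = {}
    for i, domain in enumerate(domains[:max_lookups]):
        if i > 0:
            time.sleep(delay)  # 5 seconds between API calls
        results[domain] = lookup(domain)
    return results
```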

Email Discovery Methods (In Order)

1. Website Contact Page

  • Check /contact, /about, /pages/contact
  • Look for mailto: links
  • Check footer
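The mailto: scan in this step can be sketched as a small helper; the regex and the hard-coded path list below are assumptions derived from the bullets above, not code shipped with the skill.

```python
import re

# Candidate contact-page paths, per the checklist above
CONTACT_PATHS = ["/contact", "/about", "/pages/contact"]

# Loose email pattern for mailto: hrefs (an assumed regex, not from the skill)
MAILTO_RE = re.compile(
    r'mailto:([A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,})', re.I
)

def extract_mailto_emails(html):
    """Return unique emails found in mailto: links, preserving first-seen order."""
    seen, emails = set(), []
    for email in MAILTO_RE.findall(html):
        key = email.lower()
        if key not in seen:
            seen.add(key)
            emails.append(email)
    return emails
```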

2. Instagram Bio

  • Check bio for email
  • Check "Contact" button

3. Hunter.io API

GET https://api.hunter.io/v2/domain-search
?domain={domain}
&api_key={HUNTER_API_KEY}

Response includes:

  • emails[]
  • confidence score
  • type (generic/personal)

Only use emails with confidence > 70%
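The confidence filter could look like the sketch below. It assumes the Hunter.io domain-search response nests the fields shown above under data.emails[]; verify the exact shape against the API docs before relying on it.

```python
MIN_CONFIDENCE = 70  # per the rule above: only use emails with confidence > 70%

def usable_emails(domain_search_response):
    """Filter a domain-search response down to usable emails.

    Assumes each entry carries `value`, `confidence`, and `type` fields,
    matching the response fields listed above.
    """
    emails = domain_search_response.get("data", {}).get("emails", [])
    return [e for e in emails if e.get("confidence", 0) > MIN_CONFIDENCE]
```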

4. Email Pattern Guessing

Common patterns:
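The pattern list itself is not shown on this page. As an assumption, a typical set of guesses could be generated like so; the specific patterns below are illustrative, not the skill's own list.

```python
def guess_patterns(first, last, domain):
    """Generate common email-pattern guesses (an assumed, illustrative list)."""
    first, last = first.lower(), last.lower()
    return [
        f"{first}@{domain}",            # jane@brand.com
        f"{first}.{last}@{domain}",     # jane.doe@brand.com
        f"{first}{last[:1]}@{domain}",  # janed@brand.com
        f"{first[:1]}{last}@{domain}",  # jdoe@brand.com
        f"hello@{domain}",
        f"info@{domain}",
    ]
```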

Email Priority

  1. Founder/owner personal email (best)
  2. hello@ or hi@ (good)
  3. info@ or contact@ (okay)
  4. Generic support@ (last resort)
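The priority tiers above can be expressed as a small ranking function. This is a sketch under one assumption: founder names are supplied separately, since the skill does not say how founder emails are identified.

```python
# Generic local parts mapped to the tiers above (1 = best, 4 = last resort)
GENERIC_TIERS = {"hello": 2, "hi": 2, "info": 3, "contact": 3, "support": 4}

def email_priority(email, founder_names=()):
    """Rank an address by the priority tiers above; lower is better."""
    local = email.split("@", 1)[0].lower()
    if any(name.lower() in local for name in founder_names):
        return 1  # founder/owner personal email (best)
    return GENERIC_TIERS.get(local, 3)  # unknown locals treated as "okay"

def best_email(candidates, founder_names=()):
    """Pick the highest-priority address from a list of candidates."""
    return min(candidates, key=lambda e: email_priority(e, founder_names))
```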

Output Format

{
  "domain_key": "brandname.com",
  "brand_name": "Brand Name",
  "niche": "skincare",
  "website_url": "https://brandname.com",
  "ig_handle": "@brandname",
  "followers_est": 15000,
  "contact_email": "hello@brandname.com",
  "email_confidence": "high",
  "email_source": "hunter.io",
  "source": "meta_ads",
  "status": "new"
}

Deduplication

Before adding any lead:

  1. Normalize domain: lowercase, remove www., remove https://
  2. Check if domain_key exists in Notion
  3. If exists, skip (don't duplicate)
  4. Log: "Skipped [domain] - already in pipeline"
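The normalization and skip logic above can be sketched in Python. The set of existing keys stands in for the Notion lookup in step 2, which the skill leaves unspecified.

```python
def normalize_domain(url):
    """Step 1: lowercase, remove www., remove the scheme."""
    domain = url.strip().lower()
    for prefix in ("https://", "http://"):
        if domain.startswith(prefix):
            domain = domain[len(prefix):]
    if domain.startswith("www."):
        domain = domain[4:]
    return domain.rstrip("/").split("/", 1)[0]

def dedupe(leads, existing_keys):
    """Steps 2-4: skip leads whose domain_key is already in the pipeline."""
    fresh = []
    for lead in leads:
        key = normalize_domain(lead["domain_key"])
        if key in existing_keys:
            print(f"Skipped {key} - already in pipeline")
            continue
        existing_keys.add(key)
        fresh.append({**lead, "domain_key": key})
    return fresh
```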

Batch Processing

  • Process 10 leads at a time
  • Format all data before Notion sync
  • Save formatted batch to workspace/leads-enriched-YYYY-MM-DD.json
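The batching and file write could be sketched as below; the workspace path and batch size follow the bullets above, and the JSON layout (a list of batches) is an assumption.

```python
import json
from datetime import date
from pathlib import Path

BATCH_SIZE = 10  # process 10 leads at a time

def save_batches(leads, workspace="workspace"):
    """Split leads into batches and write them to the dated JSON file."""
    out = Path(workspace) / f"leads-enriched-{date.today():%Y-%m-%d}.json"
    out.parent.mkdir(parents=True, exist_ok=True)
    batches = [leads[i:i + BATCH_SIZE] for i in range(0, len(leads), BATCH_SIZE)]
    out.write_text(json.dumps(batches, indent=2))
    return out, batches
```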
