Serper Search

v1.0.0

Search Google programmatically using the Serper.dev API. Returns structured organic results (title, URL, snippet, domain) for any query. Use when you need to...

Updated 1mo ago · MIT-0

Serper.dev Search Skill

Fast, cheap Google Search API. Returns clean JSON. Best choice for lead generation pipelines.

Credentials

Requires SERPER_API_KEY in environment. Get a free key (2,500 searches) at serper.dev.

SERPER_API_KEY=your_key_here
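A fail-fast check before making any calls can save a confusing mid-pipeline crash (a minimal sketch; the helper name and error message are illustrative):

```python
import os

def require_serper_key() -> str:
    """Return the configured API key, or raise with an actionable message."""
    key = os.environ.get("SERPER_API_KEY", "").strip()
    if not key:
        raise RuntimeError("SERPER_API_KEY is not set; get a free key at serper.dev")
    return key
```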

Quick Usage

import os

import requests
import tldextract  # third-party: pip install tldextract

def serper_search(query: str, num: int = 10, gl: str = "us") -> list[dict]:
    """
    Search Google via Serper.dev.
    Returns list of: {title, link, snippet, displayedLink, domain}
    """
    headers = {
        "X-API-KEY": os.environ["SERPER_API_KEY"],
        "Content-Type": "application/json",
    }
    payload = {"q": query, "num": num, "gl": gl}
    r = requests.post("https://google.serper.dev/search", headers=headers, json=payload, timeout=30)
    r.raise_for_status()

    results = r.json().get("organic", [])
    # Add a clean registered-domain field (e.g. "example.com") to each result
    for result in results:
        ext = tldextract.extract(result["link"])
        result["domain"] = f"{ext.domain}.{ext.suffix}"
    return results
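tldextract is a third-party dependency; if you would rather stay stdlib-only, a rough approximation is possible (note: unlike tldextract, this will not handle multi-part suffixes such as .co.uk correctly):

```python
from urllib.parse import urlparse

def naive_domain(url: str) -> str:
    """Hostname minus a leading 'www.'; no public-suffix handling."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host
```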

Lead Generation Usage

import tldextract

EXCLUDE_DOMAINS = {
    "yelp.com", "facebook.com", "tripadvisor.com", "houzz.com",
    "thumbtack.com", "homeadvisor.com", "angi.com", "angieslist.com",
    "bbb.org", "yellowpages.com", "manta.com", "bark.com",
    "google.com", "nextdoor.com", "linkedin.com", "instagram.com"
}

def search_business_leads(niche: str, city: str, num_queries: int = 3) -> list[str]:
    """
    Run multiple queries for a niche+city combo.
    Returns deduplicated list of direct business website URLs.
    """
    query_templates = [
        f'"{niche} {city}"',
        f'"{niche} company {city}"',
        f'"{niche} service {city}"',
        f'"{city} {niche} contractor"',
    ]
    
    seen_domains = set()
    urls = []
    
    for template in query_templates[:num_queries]:
        query = template + " -yelp -facebook -tripadvisor -houzz -thumbtack -homeadvisor"
        results = serper_search(query, num=10)
        
        for r in results:
            domain = r["domain"]
            if domain not in EXCLUDE_DOMAINS and domain not in seen_domains:
                seen_domains.add(domain)
                urls.append(r["link"])
    
    return urls
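The dedup/exclusion core of the loop above can be factored into a pure helper (the name `filter_lead_urls` is hypothetical) so it can be unit-tested without any network calls:

```python
def filter_lead_urls(results: list[dict], exclude: set[str], seen: set[str]) -> list[str]:
    """Keep one URL per previously unseen, non-directory domain; mutates `seen`."""
    urls = []
    for r in results:
        domain = r["domain"]
        if domain not in exclude and domain not in seen:
            seen.add(domain)
            urls.append(r["link"])
    return urls
```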

Available Endpoints

Endpoint     URL                                  Use
Web search   https://google.serper.dev/search     Organic results
News         https://google.serper.dev/news       News articles
Images       https://google.serper.dev/images     Image results
Places       https://google.serper.dev/places     Google Maps results
Shopping     https://google.serper.dev/shopping   Product results

Places Endpoint (Local Business Discovery Alternative)

def serper_places(query: str, gl: str = "us") -> list[dict]:
    """
    Search Google Maps / Local Places.
    Returns: {title, address, phone, website, rating, ratingCount}
    """
    headers = {"X-API-KEY": os.environ["SERPER_API_KEY"], "Content-Type": "application/json"}
    payload = {"q": query, "gl": gl}
    r = requests.post("https://google.serper.dev/places", headers=headers, json=payload, timeout=30)
    r.raise_for_status()
    return r.json().get("places", [])

# Places returns website URLs + phone numbers directly — great for lead gen!
# usage: results = serper_places("landscaping company Portland OR")
# each result: {title, address, phone, website, rating, ratingCount}
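A places result can be flattened into a lead row; the field names follow the response keys noted above (a sketch, and `place_to_lead` is a hypothetical helper):

```python
def place_to_lead(place: dict) -> dict:
    """Pull the lead-relevant fields out of one places result."""
    return {
        "name": place.get("title"),
        "phone": place.get("phone"),
        "website": place.get("website"),
        "rating": place.get("rating"),
        "reviews": place.get("ratingCount"),
    }
```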

Response Structure

{
  "organic": [
    {
      "title": "Green Valley Landscaping | Portland OR",
      "link": "https://www.greenvalleylandscaping.com",
      "snippet": "Professional landscaping services in Portland...",
      "displayedLink": "www.greenvalleylandscaping.com",
      "position": 1
    }
  ],
  "searchParameters": {"q": "landscaping portland OR", "gl": "us", "num": 10}
}

Pricing Reference

Plan       Cost   Searches
Free       $0     2,500
Starter    $50    50,000
Standard   $100   150,000

Rate Limits & Best Practices

  • No hard rate limit is documented; keep request rates reasonable
  • Add 0.5s delay between rapid consecutive calls
  • Cache results to avoid re-querying same niche+city
  • Use num=10 (default) — max is 100 per request
  • gl param controls country: "us", "gb", "au", "ca", etc.
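The delay and caching advice above can be combined in one small wrapper (a sketch; the class name and interval are illustrative):

```python
import time

class CachedSearcher:
    """Cache results per (query, num) and space live calls by min_interval seconds."""
    def __init__(self, fetch, min_interval: float = 0.5):
        self.fetch = fetch              # e.g. serper_search
        self.min_interval = min_interval
        self._cache: dict = {}
        self._last_call = 0.0

    def search(self, query: str, num: int = 10):
        key = (query, num)
        if key in self._cache:
            return self._cache[key]     # cache hit: no API call, no delay
        wait = self.min_interval - (time.monotonic() - self._last_call)
        if wait > 0:
            time.sleep(wait)
        self._last_call = time.monotonic()
        self._cache[key] = self.fetch(query, num)
        return self._cache[key]
```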

Error Handling

import time

def serper_search_safe(query: str, num: int = 10) -> list[dict]:
    try:
        return serper_search(query, num)
    except requests.exceptions.HTTPError as e:
        if e.response.status_code == 429:
            print("Rate limited, waiting 5s")
            time.sleep(5)
            return serper_search(query, num)  # single retry; a second 429 will raise
        print(f"Serper error: {e}")
        return []
    except Exception as e:
        print(f"Unexpected error: {e}")
        return []
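For repeated 429s the single retry above is not enough; a bounded exponential backoff is one option (a sketch; the attempt count and base delay are arbitrary):

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 1.0):
    """Call fn(); on any exception, retry with exponentially growing delays."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise                    # out of attempts: surface the error
            time.sleep(base_delay * (2 ** i))
```

Usage would look like `with_retries(lambda: serper_search(query))`.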

Version tags

latest: vk9780v6k5rk4drkh34rj43ggjs8269y3