LinkedIn Jobs

v1.0.0

Search and monitor LinkedIn job listings with city-based filters, hourly cron support, and smart deduplication. Supports 100+ global tech hubs.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for yashsuman15/linkedin-jobs.

Prompt preview (Install & Setup):
Install the skill "LinkedIn Jobs" (yashsuman15/linkedin-jobs) from ClawHub.
Skill page: https://clawhub.ai/yashsuman15/linkedin-jobs
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required binaries: python
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install linkedin-jobs

ClawHub CLI


npx clawhub@latest install linkedin-jobs
Security Scan

VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description (LinkedIn job search + hourly monitoring) match the included files: a scraper (linkedin_scraper.py), a cron/profile manager (linkedin_cron.py), geo IDs, and a config template. The required binary (python) and pip packages (requests, beautifulsoup4) are proportional to the stated purpose.
Instruction Scope
SKILL.md instructs the agent to run the provided Python scripts for one-off searches and to manage scheduled profiles. The scripts read/write local JSON files (search_profiles.json, seen_jobs.json, config.json) in the skill directory — which is consistent with the stated capabilities. There is no instruction to read unrelated system files, access secrets, or transmit data to endpoints other than LinkedIn.
Install Mechanism
Registry metadata indicated 'no install spec', but SKILL.md metadata includes an 'install' entry recommending installing pip packages (requests, beautifulsoup4). Installing via pip is proportional and expected; the minor inconsistency between registry-level install spec and SKILL.md metadata is worth noting but not dangerous.
Credentials
The skill requires no environment variables or credentials. It only needs Python and two common libraries. It writes local files for profiles and seen-job tracking — this is proportional to a monitoring/deduplication feature.
Persistence & Privilege
always:false and normal autonomous invocation are in place. The skill does create and update files in its own directory (search_profiles.json, seen_jobs.json) but does not request system-wide changes or modify other skills' configs. README suggests the agent can configure cron, but the code itself provides a runner rather than automatic cron installation.
Assessment
This skill appears to do what it says: it scrapes LinkedIn job pages and keeps local state (saved profiles and seen-job history). Before installing, consider:

  1. It will run Python scripts on your machine/agent and create search_profiles.json and seen_jobs.json in the skill folder — make sure you are comfortable with local file writes.
  2. Scraping LinkedIn may violate LinkedIn's terms of service or trigger rate limits — keep delays and max_pages conservative.
  3. SKILL.md recommends installing pip packages (requests, beautifulsoup4) even though the registry listed no install spec — ensure dependencies are installed in a controlled environment.
  4. The README suggests the agent can "set up cron", but the included code only provides a runner you must schedule yourself (or let the agent create a cron job) — verify what automation the agent will perform.
  5. No credentials are requested by the skill, so there is no obvious secret-exfiltration risk.

If you need higher assurance, review the two Python files for any unexpected network endpoints or obfuscated code (they appear to call only LinkedIn endpoints).

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

  • Binaries: python

Publisher: 💼 Clawdis
Latest: v1.0.0 (vk974ggr6d983fmwk5px4np1nsn84cw9y)
138 downloads · 0 stars · 1 version
Updated 2 weeks ago
License: MIT-0

LinkedIn Job Search Skill

Search and monitor LinkedIn job listings with powerful filters. Supports 100+ global tech hubs with precise geo IDs, hourly monitoring via cron, and smart deduplication.

Configuration

After installation, optionally customize by copying config.example.json to config.json:

cp {baseDir}/config.example.json {baseDir}/config.json

Configurable options:

  • Default filters (experience, remote, date_posted)
  • Scraper delays and timeout
  • Notification preferences
  • Custom geo IDs for your cities
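
A hypothetical config.json covering the options above (the key names here are illustrative assumptions; check config.example.json for the actual schema the scripts read):

```json
{
  "default_filters": {
    "experience": "2",
    "remote": "1,3",
    "date_posted": "r86400"
  },
  "scraper": {
    "delay_seconds": 3,
    "timeout_seconds": 15,
    "max_pages": 2
  },
  "custom_geo_ids": {
    "My City, Country": "123456789"
  }
}
```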

Capabilities

  1. One-time search: Search LinkedIn for jobs matching keywords and filters
  2. Scheduled monitoring: Add search profiles that run hourly via cron
  3. Smart deduplication: Only shows new jobs you haven't seen before
  4. Global city support: 100+ tech hubs with precise geo IDs

How to Use

One-Time Job Search

Use the exec tool to run a direct search:

python {baseDir}/linkedin_scraper.py --keywords "AI Engineer" --location "Bengaluru, India" --max-pages 2

Parameters:

| Parameter | Description | Example Values |
|---|---|---|
| --keywords, -k | Job search keywords (required) | "AI Engineer", "Python Developer" |
| --location, -l | City, country | "Noida, India", "San Francisco", "Berlin" |
| --experience, -e | Experience levels | 2 (Entry), 3 (Associate), 4 (Mid-Senior) |
| --remote, -r | Work arrangement | 1 (On-site), 2 (Remote), 3 (Hybrid) |
| --date-posted, -d | Time filter | r86400 (24h), r604800 (1wk), r2592000 (1mo) |
| --job-type, -j | Employment type | F (Full-time), P (Part-time), C (Contract) |
| --max-pages, -p | Pages to scrape (25 jobs/page) | 1-5 |

Example - Entry level AI jobs in Noida, hybrid/on-site:

python {baseDir}/linkedin_scraper.py --keywords "AI Engineer" --location "Noida, India" --experience "2" --remote "1,3" --max-pages 2

Managing Search Profiles (for Hourly Monitoring)

When the user wants to set up recurring job searches, use these commands:

Add a new search profile:

python {baseDir}/linkedin_cron.py add --keywords "AI Engineer" --location "Bengaluru, India"

Add multiple job titles at once (comma-separated):

python {baseDir}/linkedin_cron.py add --keywords "AI Engineer, ML Engineer, Data Scientist" --location "Bengaluru, India"

This creates 3 separate profiles with the same location and filters, and deduplicates results across all of them.
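
Per the security notes above, deduplication state lives in seen_jobs.json. A minimal sketch of how seen-job filtering might work, assuming each job dict carries a stable url field (the file name comes from the skill; the function names here are illustrative, not the scripts' actual internals):

```python
import json
from pathlib import Path

SEEN_FILE = Path("seen_jobs.json")

def load_seen() -> set:
    """Return the set of job URLs already shown to the user."""
    if SEEN_FILE.exists():
        return set(json.loads(SEEN_FILE.read_text()))
    return set()

def filter_new(jobs: list) -> list:
    """Keep only jobs not seen before, then persist the updated set."""
    seen = load_seen()
    new = [j for j in jobs if j["url"] not in seen]
    seen.update(j["url"] for j in new)
    SEEN_FILE.write_text(json.dumps(sorted(seen)))
    return new
```

Because the seen set is shared across profiles, a job matched by both "AI Engineer" and "ML Engineer" is only reported once.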

Add with custom filters:

python {baseDir}/linkedin_cron.py add --keywords "Python Developer" --location "San Francisco" --experience "2" --remote "2,3"

List all search profiles:

python {baseDir}/linkedin_cron.py list

Run all enabled profiles now (for hourly cron or manual check):

python {baseDir}/linkedin_cron.py run

Run specific profile:

python {baseDir}/linkedin_cron.py run --profile ai-engineer-bengaluru

Enable/Disable a profile:

python {baseDir}/linkedin_cron.py enable --profile ai-engineer-bengaluru
python {baseDir}/linkedin_cron.py disable --profile ai-engineer-bengaluru

Remove a profile:

python {baseDir}/linkedin_cron.py remove --profile ai-engineer-bengaluru

Clear job history (to see all jobs again):

python {baseDir}/linkedin_cron.py clear-history

View statistics:

python {baseDir}/linkedin_cron.py stats

Supported Cities (100+ Global Tech Hubs)

The skill has built-in geo IDs for precise location-based results:

India: Bengaluru, Noida, Hyderabad, Mumbai, Delhi NCR, Pune, Chennai, Gurugram, Kolkata, Ahmedabad, Jaipur, Chandigarh, Kochi, Coimbatore, Indore, Lucknow

USA: San Francisco, New York, Seattle, Austin, Boston, Los Angeles, Chicago, Denver, San Diego, Washington DC, Atlanta, Dallas, Houston, Phoenix, Miami, Portland

UK: London, Manchester, Edinburgh, Cambridge, Oxford, Bristol, Birmingham, Leeds, Glasgow

Europe: Berlin, Amsterdam, Dublin, Paris, Munich, Zurich, Stockholm, Barcelona, Madrid, Milan, Vienna, Prague, Warsaw, Brussels, Copenhagen

Asia Pacific: Singapore, Sydney, Melbourne, Tokyo, Hong Kong, Seoul, Taipei, Kuala Lumpur, Jakarta, Bangkok, Shanghai, Beijing

Canada: Toronto, Vancouver, Montreal, Ottawa, Calgary, Waterloo

Middle East: Dubai, Abu Dhabi, Riyadh, Tel Aviv, Doha

Latin America: Sao Paulo, Mexico City, Buenos Aires, Bogota, Santiago

Africa: Johannesburg, Cape Town, Lagos, Nairobi, Cairo

For unlisted cities, the skill falls back to text-based search. You can also add custom geo IDs in config.json.
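
The fallback behavior can be sketched as follows (the GEO_IDS values below are illustrative placeholders, not the skill's actual IDs, and the function name is hypothetical):

```python
# Illustrative subset; the real skill ships 100+ entries.
GEO_IDS = {
    "bengaluru, india": "000000001",  # placeholder value
    "berlin": "000000002",            # placeholder value
}

def build_location_params(location: str) -> dict:
    """Prefer a precise geoId; fall back to LinkedIn's free-text location search."""
    geo_id = GEO_IDS.get(location.strip().lower())
    if geo_id:
        return {"geoId": geo_id}
    return {"location": location}
```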

Output Format

The scraper returns JSON with job details including:

  • title: Job title
  • company: Company name
  • location: Job location
  • employment_type: Full-time, Part-time, Contract, etc.
  • experience_level: Entry level, Mid-Senior, etc.
  • posted_date: When the job was posted
  • requirements: Experience requirements extracted from description
  • tech_stack: Technologies mentioned (Python, TensorFlow, AWS, etc.)
  • role_summary: Brief description of the role
  • url: Direct link to apply
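
A short sketch of consuming that output using the field names above (assuming, which is worth verifying, that the top-level shape is a JSON array of job objects):

```python
import json

def summarize(raw: str) -> list:
    """Turn the scraper's JSON output into one-line summaries."""
    jobs = json.loads(raw)
    return [f'{j["title"]} @ {j["company"]} ({j["location"]}) - {j["url"]}'
            for j in jobs]
```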

Formatting Job Notifications

When presenting new jobs to the user, format them clearly:

Found X new jobs for "[keywords]":

━━━ [Location] ━━━

1. [Title] @ [Company]
   📍 [Location] ([Remote/Hybrid/On-site])
   💼 [Employment Type] | [Experience Level]
   🕐 Posted [time ago]
   
   📋 Requirements:
   • Experience: [requirements]
   • Tech Stack: [tech_stack]
   • Role: [role_summary]
   
   🔗 [url]
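
The template above can be rendered programmatically; a minimal sketch assuming a job dict with the fields listed under Output Format (the remote key and its default are assumptions):

```python
def format_job(idx: int, job: dict) -> str:
    """Render one job entry in the notification layout shown above."""
    return (
        f'{idx}. {job["title"]} @ {job["company"]}\n'
        f'   📍 {job["location"]} ({job.get("remote", "On-site")})\n'
        f'   💼 {job["employment_type"]} | {job["experience_level"]}\n'
        f'   🕐 Posted {job["posted_date"]}\n'
        f'   🔗 {job["url"]}'
    )
```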

User Intent Mapping

| User Says | Action |
|---|---|
| "Search LinkedIn for X jobs in Y" | Run one-time search with linkedin_scraper.py |
| "Monitor LinkedIn for X jobs" | Add profile with linkedin_cron.py add |
| "Add X jobs in Y to my searches" | Add profile with linkedin_cron.py add |
| "Search for AI Engineer, ML Engineer, Data Scientist in Bengaluru" | Add multiple profiles with comma-separated keywords |
| "Monitor these roles: X, Y, Z in location" | Add multiple profiles at once |
| "Stop searching for X" | Disable or remove profile |
| "Show my job searches" | Run linkedin_cron.py list |
| "Check for new jobs" | Run linkedin_cron.py run |
| "Clear job history" | Run linkedin_cron.py clear-history |

Default Filters

When the user doesn't specify filters, use these defaults (configurable in config.json):

  • Experience: Entry level (code: 2)
  • Remote: On-site + Hybrid (codes: 1,3)
  • Date Posted: Last 24 hours (code: r86400)
  • Max Pages: 2 (~50 jobs)
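
Applying these defaults when the user omits a flag could look like the sketch below (the real scripts' internals may differ; the function and dict names are illustrative):

```python
DEFAULTS = {
    "experience": "2",       # Entry level
    "remote": "1,3",         # On-site + Hybrid
    "date_posted": "r86400", # Last 24 hours
    "max_pages": 2,
}

def with_defaults(user_filters: dict) -> dict:
    """Fill in any filter the user did not specify."""
    merged = dict(DEFAULTS)
    merged.update({k: v for k, v in user_filters.items() if v is not None})
    return merged
```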

Filter Code Reference

Experience Levels (--experience):

  • 1 = Internship
  • 2 = Entry level
  • 3 = Associate
  • 4 = Mid-Senior level
  • 5 = Director
  • 6 = Executive

Job Types (--job-type):

  • F = Full-time
  • P = Part-time
  • C = Contract
  • T = Temporary
  • I = Internship

Remote Options (--remote):

  • 1 = On-site
  • 2 = Remote
  • 3 = Hybrid

Date Posted (--date-posted):

  • r86400 = Last 24 hours
  • r604800 = Last week
  • r2592000 = Last month
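
The r-prefixed codes are simply "r" plus the window in seconds, so other windows can be derived (whether the scraper accepts values beyond the three listed above is an assumption worth verifying):

```python
def date_posted_code(hours: int) -> str:
    """Build a LinkedIn date-posted filter code: 'r' + window in seconds."""
    return f"r{hours * 3600}"
```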

Cron Setup (Optional)

Once you've configured your search profiles, you can ask the agent to set up automated monitoring:

  • "Run my LinkedIn job searches every hour"
  • "Check for new jobs every 30 minutes"
  • "Set up daily job monitoring at 9 AM"

The agent will configure the appropriate cron schedule in OpenClaw based on your preference.
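
If you prefer to schedule the runner yourself rather than letting the agent configure OpenClaw, a plain crontab entry works too (substitute {baseDir} with the skill's install directory):

```cron
# Run all enabled LinkedIn search profiles at the top of every hour
0 * * * * python {baseDir}/linkedin_cron.py run
```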
