Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

xfetch

Use the xfetch CLI to fetch X/Twitter data - tweets, user profiles, search results, timelines, lists, DMs, and notifications. Use this skill whenever you need to...

MIT-0 · Free to use, modify, and redistribute. No attribution required.
1 current install · 2 all-time installs
by Pengfei Ni (@feiskyer)
Security Scan
VirusTotal
Suspicious
OpenClaw
Suspicious
medium confidence
Purpose & Capability
The name/description state the tool fetches X/Twitter data and the SKILL.md describes exactly that (tweets, profiles, DMs, notifications, exports). The requested capabilities (cookie-based auth, pagination, output formats) are coherent with a scraper CLI.
Instruction Scope
The SKILL.md explicitly instructs extracting cookies from the user's browser (chrome/firefox/safari/arc/brave and specific profiles), setting auth tokens, reading/writing cursor state and output DB/files, and accessing DMs and bookmarks. Those actions require reading local browser profile data and writing local files — sensitive operations not declared elsewhere. The instructions also allow proxy URLs with embedded credentials and proxy-file rotation, which raises credential-handling and storage concerns.
Install Mechanism
There is no install spec (instruction-only), which is low risk by itself, but the markdown references running the CLI via 'npx @lxgic/xfetch' / 'bunx @lxgic/xfetch' and says it's installed globally as 'xfetch'. That implies runtime downloading/executing an npm package from an external registry (supply-chain risk). The skill does not supply a vetted install source or verify package integrity.
Credentials
requires.env is empty, but the instructions require access to local browser cookies/profiles and accept proxy URLs (which can include credentials). The skill can store auth tokens and output files. These are highly sensitive capabilities relative to the simple 'fetch tweets' description and should be explicitly declared and justified.
Persistence & Privilege
The skill is not marked 'always:true' and is user-invocable; it does instruct saving and clearing its own auth state but does not request persistent platform privileges or modify other skills. Autonomous invocation is allowed (platform default) but not combined with an 'always' flag.
What to consider before installing
This SKILL.md is coherent with a cookie-based X/Twitter scraper, but it requires reading browser cookies/profiles (sensitive) and implies using an npm package (@lxgic/xfetch) from an unknown source. Before installing or using it:

1. Confirm where the xfetch binary would come from, and review the npm package's source and maintainer.
2. Consider the privacy risk of allowing access to your browser profile/cookies and DMs — don't run it on machines with sensitive accounts.
3. Prefer official APIs with scoped credentials where possible.
4. If you must run it, do so in an isolated environment (VM/container) and inspect where it stores auth tokens and any downloaded code.
5. Be aware this may violate X/Twitter terms of service and could expose private messages and tokens if misused.
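The isolation advice above can be sketched with a throwaway container. The node:22 image and the scratch directory name are illustrative assumptions, not part of the skill, and @lxgic/xfetch remains an unvetted package — review its source before running anything.

```shell
# Sketch: run the CLI in a disposable container so it cannot read host
# browser profiles. Assumes Docker is installed and uses the public
# node:22 image (an arbitrary choice for illustration).
mkdir -p xfetch-out
docker run --rm -it \
  -v "$PWD/xfetch-out:/work" \
  -w /work \
  node:22 \
  npx @lxgic/xfetch auth check
# Inside the container, browser cookie extraction fails by design (there is
# no browser profile data); you would have to pass tokens explicitly with
# 'auth set' if you decide to proceed.
```

Only the mounted scratch directory is shared with the host, so any files the tool writes stay inspectable in one place.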

Like a lobster shell, security has layers — review code before you run it.

Current version: v1.0.0
latest: vk972ztgn37vdxb93g4k8dc2rqh82ceh2

License

MIT-0
Free to use, modify, and redistribute. No attribution required.

SKILL.md

xfetch - X/Twitter CLI Data Fetcher

xfetch is a cookie-based X/Twitter scraper CLI. It requires no API keys - just browser cookies for authentication. It's installed globally as xfetch (or can be run via npx @lxgic/xfetch / bunx @lxgic/xfetch).

Prerequisites

Authentication must be set up first. Check with:

xfetch auth check

If not authenticated, extract cookies from the user's browser:

xfetch auth extract --browser chrome          # or firefox, safari, arc, brave
xfetch auth extract --browser chrome --profile "Profile 1"  # specific profile

Or set tokens directly:

xfetch auth set --auth-token <token> --ct0 <token>

Command Reference

Single Tweet

Accepts a tweet URL or numeric ID:

xfetch tweet https://x.com/user/status/123456789
xfetch tweet 123456789

Also works with X Article URLs (/article/ID).

User Profile

xfetch user @handle        # by handle (@ is optional)
xfetch user 12345678       # by numeric ID

User Tweets

xfetch tweets @handle                    # latest 20 tweets
xfetch tweets @handle -n 50              # 50 per page
xfetch tweets @handle --replies          # include replies
xfetch tweets @handle --all              # all pages (paginated)
xfetch tweets @handle --max-pages 5      # limit to 5 pages

Thread / Conversation

xfetch thread <url-or-id>    # full conversation thread

Search

xfetch search "query"                          # top results
xfetch search "query" --type latest            # latest tweets
xfetch search "query" --type people            # user results
xfetch search "query" --type photos            # photo tweets
xfetch search "query" --type videos            # video tweets
xfetch search "from:handle since:2024-01-01"   # advanced operators
xfetch search "query" -n 100 --all             # all pages

Search types: top (default), latest, people, photos, videos.
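Search operators compose with the pagination and output flags documented below; for example (the from:/since: operators are standard X search syntax, and the handle and file name are placeholders):

```shell
# Latest tweets from one account since a date, streamed to a JSONL file.
xfetch search "from:handle since:2024-01-01" --type latest --all --format jsonl > handle-2024.jsonl
```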

Timelines

xfetch home                    # algorithmic home timeline
xfetch home --following        # chronological (following only)
xfetch bookmarks               # your bookmarks
xfetch likes @handle           # user's liked tweets

Followers / Following

xfetch followers @handle -n 100
xfetch following @handle -n 100

Lists

xfetch lists @handle                        # user's lists
xfetch list <list-id-or-url>                # list details
xfetch list-members <list-id-or-url>        # list members
xfetch list-tweets <list-id-or-url> -n 50   # list timeline

Direct Messages

xfetch dms                                  # inbox overview
xfetch dms inbox                            # same as above
xfetch dms conversation <conversation_id>   # messages in a conversation
xfetch dms <conversation_id>                # shortcut for above

Notifications

xfetch notifications                  # all notifications
xfetch mentions                       # mentions only
xfetch verified-notifications         # from verified accounts

Auth Management

xfetch auth check                             # show auth status
xfetch auth extract --browser chrome          # extract cookies
xfetch auth set --auth-token <t> --ct0 <t>    # set tokens manually
xfetch auth clear                             # clear saved auth
xfetch auth browsers                          # list available browsers + profiles

Query ID Management

xfetch query-ids --refresh    # fetch latest query IDs from X frontend
xfetch query-ids --list       # show cached query IDs

Pagination Options

All list-like commands (tweets, search, followers, following, likes, bookmarks, home, notifications, mentions, list-members, list-tweets, dms inbox, dms conversation) support these pagination options:

Flag                 Description
-n, --count <N>      Results per page (default: 20)
--all                Fetch all available pages
--max-pages <N>      Maximum number of pages to fetch
--cursor <cursor>    Resume from a specific cursor
--resume <file>      Save/restore cursor state to a file
--delay <ms>         Delay between page requests (default: 1000ms)
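A large, resumable fetch might combine these flags as follows; the cursor and output file names are arbitrary:

```shell
# Fetch all pages, persisting the cursor so an interrupted run can continue
# from where it stopped instead of re-fetching earlier pages, with a longer
# delay to stay well under rate limits.
xfetch tweets @handle --all --resume cursor.json --delay 2000 --format jsonl > tweets.jsonl
```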

Output Formats

Control output with --format:

xfetch tweets @handle --format json      # pretty-printed JSON (default)
xfetch tweets @handle --format jsonl     # newline-delimited JSON (streaming)
xfetch tweets @handle --format csv       # CSV with headers
xfetch tweets @handle --format sqlite --db tweets.db   # SQLite database

Additional output flags:

  • --json — shorthand for --format json
  • --plain — disable pretty printing

Pipe to files with shell redirection:

xfetch tweets @handle -n 100 --format csv > tweets.csv
xfetch search "AI" --all --format jsonl > results.jsonl

Global Options

These can be passed to any command:

Flag                       Description
--auth-token <token>       Override auth_token cookie
--ct0 <token>              Override ct0 cookie
--cookie-source <src>      Cookie source browser
--chrome-profile <name>    Chrome profile name
--proxy <url>              Proxy URL (http://user:pass@host:port)
--proxy-file <path>        File with proxy URLs for rotation
--timeout <ms>             Request timeout (default: 30000)
--delay <ms>               Delay between requests (default: 500)
--no-color                 Disable colored output
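Putting several global options together, a proxied, slowed-down request might look like this (host, port, and credentials are placeholders; note that a proxy URL embeds credentials, so avoid leaving it in shell history):

```shell
# Route requests through an authenticated proxy with a longer timeout
# and a gentler request rate.
xfetch search "query" \
  --proxy "http://user:pass@proxy.example.com:8080" \
  --timeout 60000 \
  --delay 2000
```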

Tips for Effective Use

  • When fetching large datasets, prefer --format jsonl for streaming output (avoids buffering entire result set in memory).
  • Use --max-pages instead of --all when you only need a sample — it's faster and avoids rate limits.
  • If a request fails with a 404 or query ID error, try xfetch query-ids --refresh to update cached query IDs — X periodically rotates these.
  • Pipe JSON output through jq for filtering: xfetch tweets @handle | jq '.[].text'
  • For SQLite output, always specify --db: xfetch tweets @handle --format sqlite --db data.db
  • Rate limits are handled internally with jitter, but high-volume scraping benefits from --delay 2000 or higher.
