Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

X-Scout

v1.0.0

X/Twitter intelligence scraper. Search tweets, scrape profiles, pull comments, auto-transcribe videos. Classify tweets as replicable methods vs content. CLI...

0 stars · 186 downloads · 1 install current · 1 all-time

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw to install aces1up/x-scout.

Prompt preview: Install & Setup
Install the skill "X-Scout" (aces1up/x-scout) from ClawHub.
Skill page: https://clawhub.ai/aces1up/x-scout
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required env vars: TWITTERAPI_KEY
Required binaries: python3, curl
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install x-scout

ClawHub CLI


npx clawhub@latest install x-scout
Security Scan

VirusTotal: Suspicious
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description (Twitter/X scraping, optional method classification and transcription) align with the required binaries (python3, curl), the required TWITTERAPI_KEY, and the optional keys (OpenRouter, Deepgram). The included Python script and setup script implement the scraping, classification, and transcription paths that the description promises.
Instruction Scope
SKILL.md and setup.sh instruct the agent/user to run setup.sh, create a .env file and a config file in ~/.x-scout, and then run x_scout.py. The runtime instructions do not attempt to read unrelated system files, but they do persist keys and an install_id to disk, and the runtime code silently POSTs usage data (including the install_id and a hashed query) to clawagents.dev. These behaviors are disclosed in SKILL.md, but they are privacy-relevant and worth the user's attention.
Install Mechanism
No remote archive downloads or obscure install hosts. setup.sh creates a local venv and runs pip install -r requirements.txt (requests, python-dotenv). No high-risk download URLs or extract-from-arbitrary-URL steps are used. yt-dlp is optional and not automatically downloaded.
Credentials
Only TWITTERAPI_KEY is required (declared as primary). Other API keys (OpenRouter, Cerebras, Deepgram) are optional and used only for optional features (method detection, query optimization, transcription). The script does store these keys to .env and ~/.x-scout/config.json in plaintext, which is reasonable for a CLI tool but is a privacy/security consideration for secret management.
Persistence & Privilege
always:false (no forced global inclusion). The setup writes files to $SCRIPT_DIR/.env and ~/.x-scout/config.json and registers an install_id; the runtime reports usage on each run to the analytics endpoint. The skill does not modify other skills or agent-wide settings. The persistent telemetry + stored install_id means activity can be correlated over time; this is disclosed but worth user consideration.
Assessment
This skill is internally consistent with its stated purpose, but review the privacy implications before installing. setup.sh will create a virtualenv, write your API keys in plaintext to a .env in the script directory and to ~/.x-scout/config.json, and POST a registration to https://clawagents.dev. The runtime script will silently POST usage telemetry (install_id, a short hash of queries, result counts, errors) to the same analytics endpoint on every run.

To reduce risk:

  1. Avoid entering highly privileged or long-lived credentials unless necessary; use separate, limited API keys.
  2. Inspect or modify setup.sh to skip the analytics registration, or block outbound calls to clawagents.dev at the network level.
  3. Consider running the tool in an isolated environment (container or throwaway VM).
  4. Rotate or revoke the keys stored by the tool when you stop using it.

If you need different behavior (no telemetry, encrypted key storage), ask the author for a build that omits telemetry, or make those changes locally before using.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

🔍 Clawdis
Bins: python3, curl
Env: TWITTERAPI_KEY
Primary env: TWITTERAPI_KEY
Tags: intelligence, latest, research, scraping, twitter, x
186 downloads · 0 stars · 1 version · Updated 22h ago
v1.0.0
MIT-0

X-Scout

X/Twitter intelligence scraper. Search by keyword, scrape profiles, pull comments, auto-transcribe videos. Classifies tweets as replicable methods vs general content.

Setup

Before first use, run the setup script to configure your API keys:

bash setup.sh

This prompts for your TwitterAPI.io key (required) and optional keys for method detection and video transcription. Your install is registered with ClawAgents for usage tracking.

If you already have keys configured, set them as environment variables:

export TWITTERAPI_KEY=your_key_here
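
The optional keys listed under Required Keys below can be exported the same way. A sketch with placeholder values (set only the keys whose features you actually use):

```shell
# Required for all modes
export TWITTERAPI_KEY=your_key_here

# Optional: only needed for the corresponding features
export OPENROUTER_API_KEY=your_openrouter_key   # method detection
export DEEPGRAM_API_KEY=your_deepgram_key       # video transcription
```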

Modes

Search tweets by keyword:

python3 x_scout.py --search "ai agent" --limit 20

Scrape profile posts:

python3 x_scout.py --profile @elonmusk --limit 10

Pull comments/replies on a tweet:

python3 x_scout.py --comments "https://x.com/user/status/123456"

Full intel (tweet + video + comments + transcription):

python3 x_scout.py --intel "https://x.com/user/status/123456"

Options

Flag                  Description                 Default
--search "query"      Search tweets by keyword    --
--profile @handle     Scrape profile posts        --
--comments <url>      Pull replies to a tweet     --
--intel <url>         Full intel on a tweet       --
--limit N             Max results                 20
--since YYYY-MM-DD    Date filter                 180 days ago
--no-methods          Skip method detection       methods on by default
--no-transcribe       Skip video transcription    transcribes if key set
--json                Output as JSON              table view
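
The flag surface above maps onto a standard argparse layout. A minimal sketch mirroring the documented flags and defaults (illustrative only, not the skill's actual x_scout.py code):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the documented flags; defaults taken from the Options table.
    p = argparse.ArgumentParser(prog="x_scout.py")
    mode = p.add_mutually_exclusive_group(required=True)
    mode.add_argument("--search", metavar='"query"', help="Search tweets by keyword")
    mode.add_argument("--profile", metavar="@handle", help="Scrape profile posts")
    mode.add_argument("--comments", metavar="<url>", help="Pull replies to a tweet")
    mode.add_argument("--intel", metavar="<url>", help="Full intel on a tweet")
    p.add_argument("--limit", type=int, default=20, help="Max results (default 20)")
    p.add_argument("--since", help="Date filter, YYYY-MM-DD (default: 180 days ago)")
    p.add_argument("--no-methods", action="store_true", help="Skip method detection")
    p.add_argument("--no-transcribe", action="store_true", help="Skip video transcription")
    p.add_argument("--json", action="store_true", help="Output JSON instead of a table")
    return p

args = build_parser().parse_args(["--search", "ai agent", "--limit", "50", "--json"])
print(args.search, args.limit, args.json)
```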

Method Detection

When enabled (default), X-Scout classifies each tweet as:

  • METHOD: Describes a specific tool, technique, or workflow (replicable)
  • CONTENT: General commentary, results showcase, promotional

For METHOD tweets, it extracts: method name, tools required, category, complexity, and a summary. Requires an OpenRouter API key.
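
When consuming the --json output, METHOD tweets can be filtered out in a few lines. The field names below ("classification", "method") are assumptions for illustration, not the skill's confirmed schema:

```python
import json

# Hypothetical sample of --json output; the shape is assumed, not confirmed.
sample = json.loads("""
[
  {"author": "a", "text": "Use X to automate Y", "classification": "METHOD",
   "method": {"name": "Y pipeline", "tools": ["X"], "complexity": "low"}},
  {"author": "b", "text": "Great results this week!", "classification": "CONTENT"}
]
""")

# Keep only tweets classified as replicable methods.
methods = [t for t in sample if t.get("classification") == "METHOD"]
for t in methods:
    print(t["method"]["name"], "-", ", ".join(t["method"]["tools"]))
```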

Auto-Transcription

Any tweet with an embedded video is automatically:

  1. Downloaded via yt-dlp
  2. Transcribed via Deepgram
  3. Transcript included in output and method classification

Requires a Deepgram API key (set during setup).

Output

Table view (default) shows: author, likes, views, retweets, media type, transcript status, and tweet preview.

JSON view (--json) outputs full structured data for piping to other tools.
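
As a piping example, the records can be ranked by engagement in a small downstream script. Field names ("likes", "author") are assumptions based on the table view columns, not a confirmed schema:

```python
import json

def top_by_likes(records, n=5):
    """Rank tweet records by like count, highest first."""
    return sorted(records, key=lambda t: t.get("likes", 0), reverse=True)[:n]

# Stand-in for output piped from `python3 x_scout.py --search "ai agent" --json`.
sample = json.loads('[{"author":"a","likes":12},{"author":"b","likes":340},{"author":"c","likes":7}]')
for t in top_by_likes(sample, 2):
    print(t["author"], t["likes"])
```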

Required Keys

Key                  Required?  What It Does                           Cost
TWITTERAPI_KEY       Yes        Tweet search, profile scrape, replies  ~$50/mo
OPENROUTER_API_KEY   Optional   Method detection via Grok              Pay-per-use
CEREBRAS_API_KEYS    Optional   Query optimization                     Free tier
DEEPGRAM_API_KEY     Optional   Video transcription                    Free tier
