GetMarkdown
Pass. Audited by ClawScan on May 14, 2026.
Overview
This instruction-only skill transparently uses WebCrawlerAPI to scrape or crawl user-requested URLs, with disclosed API-key use and scoped local output.
Use this skill if you are comfortable sending requested URLs or domains to WebCrawlerAPI and storing crawl results locally. Set the API key in a trusted shell, avoid crawling private/internal sites unless appropriate, and review before increasing the default crawl limit.
Publisher note
The API key must be set as an environment variable before running any curl commands; users can obtain a key at https://dash.webcrawlerapi.com/access
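A minimal sketch of this setup step, assuming only what the note above states: the key is exported in the current trusted shell session, and a guard check confirms it is present before any curl command runs. The placeholder value `your_api_key` is illustrative.

```shell
# Set the key in this shell session only; obtain it from
# https://dash.webcrawlerapi.com/access
export WEBCRAWLERAPI_API_KEY="your_api_key"

# Guard: refuse to proceed if the key is missing or empty,
# so later curl commands never run unauthenticated.
if [ -z "${WEBCRAWLERAPI_API_KEY:-}" ]; then
  echo "WEBCRAWLERAPI_API_KEY is not set" >&2
  exit 1
fi
echo "key is set"
```

Exporting in the shell (rather than hard-coding the key in a script) keeps the credential out of files that might be shared or committed.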
Findings (3)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
The agent can make WebCrawlerAPI requests using the user's key, which may affect account usage or billing.
The skill uses a WebCrawlerAPI bearer token for its provider requests. This is expected for the service, but users should recognize it as account credential use.
export WEBCRAWLERAPI_API_KEY="your_api_key" ... "Authorization: Bearer ${WEBCRAWLERAPI_API_KEY}"

Use a scoped or revocable API key where possible, set it only in trusted environments, and monitor provider usage.
URLs or domains requested for scraping/crawling will be sent to WebCrawlerAPI.
The skill instructs the agent to run shell curl commands against an external API using user-supplied URLs. This is central to the purpose and bounded by a default crawl limit.
curl --fail --silent --show-error ... --url "https://api.webcrawlerapi.com/v1/crawl" ... "items_limit": 25
Use it only for URLs you are comfortable sharing with that provider, and review requests before crawling private or internal sites.
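The crawl request described in this finding can be sketched as follows. The endpoint and the `items_limit` field come from the snippet above; the payload shape and the target URL `https://example.com` are assumptions for illustration. The curl call itself is commented out so the request can be reviewed before anything is sent to the provider.

```shell
# Build the request body locally first, so it can be inspected
# before any data leaves the machine. "items_limit": 25 is the
# default crawl bound noted in the finding.
PAYLOAD='{"url": "https://example.com", "items_limit": 25}'

# After review, submit it (requires WEBCRAWLERAPI_API_KEY):
# curl --fail --silent --show-error \
#   --request POST \
#   --header "Authorization: Bearer ${WEBCRAWLERAPI_API_KEY}" \
#   --header "Content-Type: application/json" \
#   --data "$PAYLOAD" \
#   --url "https://api.webcrawlerapi.com/v1/crawl"

# Sanity-check the payload is valid JSON with the expected limit.
echo "$PAYLOAD" | python3 -c "import sys,json; print(json.load(sys.stdin)['items_limit'])"
```

Separating payload construction from submission gives a natural review point before private or internal URLs are shared with the provider.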
The skill may not work as expected unless the required local tools and the API key are available.
The instructions rely on local curl and python3 commands, while the registry requirements list no required binaries or environment variables. This is a disclosure/operability gap rather than malicious behavior.
curl --fail --silent --show-error ... STATUS=$(echo "$RESULT" | python3 -c "import sys,json; print(json.load(sys.stdin)['status'])")
The publisher should declare curl, python3, and WEBCRAWLERAPI_API_KEY in metadata; users should review the commands before running them.
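The status-parsing step flagged in this finding can be exercised offline. The sketch below substitutes a canned sample response for `RESULT` (the real value would come from the curl call), so the python3 dependency and the parsing logic can be reviewed without the API key or network access; the `job_id` field is an illustrative assumption.

```shell
# Canned sample response standing in for the real API reply,
# so the parsing can be tested without calling the provider.
RESULT='{"status": "done", "job_id": "abc123"}'

# Same extraction pattern as the finding's snippet: pipe the JSON
# into python3 and pull out the "status" field.
STATUS=$(echo "$RESULT" | python3 -c "import sys,json; print(json.load(sys.stdin)['status'])")
echo "$STATUS"
```

Trying this with a dummy response also confirms that python3 is actually present, which is exactly the undeclared dependency this finding is about.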
