AI Profit Engine
Verdict: Suspicious. Audited by ClawScan on May 10, 2026.
Overview
This skill is an opportunity scanner, but it embeds a hardcoded Moltbook bearer token and executes an unbundled script from the user's workspace, so it needs review before use.
Remove the hardcoded Moltbook token and review the call to the unbundled Polymarket helper script before installing. If you still want to use the skill, run it manually first, check which files it executes from your workspace, and only enable the hourly cron job after confirming the code and credentials are under your control.
Findings (3)
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Finding 1: Hardcoded Moltbook bearer token
The script sends requests with a hardcoded Moltbook bearer token, even though the registry declares no primary credential or required environment variable and SKILL.md describes MOLTBOOK_API_KEY as configurable. Requests may therefore run under an unknown shared account or an exposed token rather than the user's own credential, which can cause account, rate-limit, or authorization problems.
Evidence: -H "Authorization: Bearer moltbook_sk_..."
Recommendation: Remove the embedded token, require a user-provided MOLTBOOK_API_KEY, declare it in the metadata, and document exactly what account access is used.
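The credential fix can be sketched in shell: build the Authorization header from a user-provided MOLTBOOK_API_KEY and refuse to run when it is unset. The function name below is illustrative; only the variable name MOLTBOOK_API_KEY comes from SKILL.md.

```shell
# Sketch: source the Moltbook credential from the environment instead of
# embedding it. build_auth_header is a hypothetical helper name.
build_auth_header() {
  if [ -z "${MOLTBOOK_API_KEY:-}" ]; then
    echo "error: MOLTBOOK_API_KEY is not set; export it before running" >&2
    return 1
  fi
  # Print the header value for use as: curl -H "$(build_auth_header)" ...
  printf 'Authorization: Bearer %s' "$MOLTBOOK_API_KEY"
}
```

This keeps the token out of the skill's files and logs, and it fails closed when no credential is configured.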
Finding 2: Unbundled helper script executed from the workspace
After changing into the workspace, the script runs a relative helper script that is not included in the provided file manifest, creating an unreviewed code dependency and a possible path hijack: if a different or tampered file exists at that workspace path, running this skill could execute arbitrary local commands under the user's account.
Evidence: cd "$HOME/.openclaw/workspace" || exit 1 ... bash scripts/polymarket_wallet_monitor.sh >> "$LOG" 2>&1
Recommendation: Bundle and review the helper script, call it from the skill's own installation directory, and verify its provenance, or remove the call until the dependency is provided.
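The recommendation above can be sketched as a wrapper that resolves the helper relative to the skill's own install directory and verifies a pinned SHA-256 before executing it. The function name and argument layout are assumptions; only the scripts/polymarket_wallet_monitor.sh path comes from the audited skill, and sha256sum assumes a GNU/Linux environment.

```shell
# Sketch: refuse to run the helper from the writable workspace; pin it to
# the skill's install directory and a checksum recorded at review time.
run_helper() {
  skill_dir=$1        # the skill's installation directory, not the workspace
  expected_sha=$2     # SHA-256 recorded when the helper was reviewed
  helper="$skill_dir/scripts/polymarket_wallet_monitor.sh"
  [ -f "$helper" ] || { echo "helper missing: $helper" >&2; return 1; }
  actual=$(sha256sum "$helper" | awk '{print $1}')
  if [ "$actual" != "$expected_sha" ]; then
    echo "helper checksum mismatch; refusing to run" >&2
    return 1
  fi
  bash "$helper"
}
```

A tampered or substituted file then fails the checksum comparison instead of executing under the user's account.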
Finding 3: Persistent scheduled execution
The skill recommends periodic background execution. This matches its monitoring purpose, but if enabled in cron the scanner will keep making network requests and writing logs on a schedule until the user disables it.
Evidence: "Run hourly via cron for continuous opportunity detection."
Recommendation: Only add the cron job if continuous monitoring is desired, and periodically review the log file and network/API usage.
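If continuous monitoring is wanted, the cron entry can be generated and inspected before it is ever installed. The function name, paths, and log location below are assumptions for illustration, not part of the skill.

```shell
# Sketch: build (but do not install) the hourly cron line so it can be
# reviewed first. make_cron_entry and both example paths are hypothetical.
make_cron_entry() {
  skill_cmd=$1   # entry-point script for the scanner
  log_file=$2    # log file to review periodically, per the recommendation
  printf '0 * * * * %s >> %s 2>&1' "$skill_cmd" "$log_file"
}
# Inspect the printed line, then add it yourself via: crontab -e
```

Installing the line by hand, rather than letting the skill edit the crontab, keeps scheduling under the user's explicit control.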
