Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Google Colab GPU Runtime

v1.3.0

Execute code on Google Colab GPU runtimes (T4/L4/A100/H100) and manage persistent storage via Google Drive. Use when tasks need GPU compute (ML training, inf...


Install


Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for isotrivial/colab.

Prompt preview (Install & Setup):
Install the skill "Google Colab GPU Runtime" (isotrivial/colab) from ClawHub.
Skill page: https://clawhub.ai/isotrivial/colab
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install colab

ClawHub CLI


npx clawhub@latest install colab
Security Scan
VirusTotal: Suspicious
OpenClaw: Benign (high confidence)
Purpose & Capability
Name/description (Colab GPU runtimes + Drive persistence + TTS) align with the provided scripts: colab_run.py (assign/connect kernels, GPU selection), colab_drive.py (Drive upload/download), colab_tts.py (F5-TTS orchestration). No unrelated credentials or services are requested in metadata.
Instruction Scope
Runtime instructions include creating/updating ~/.colab-mcp-auth-token.json and optionally ~/colab-mcp-oauth-config.json, injecting that token into scripts sent to Colab, and mounting Drive from inside Colab. This is necessary for Drive persistence but means an OAuth token (including refresh token) is embedded into scripts uploaded to remote Colab runtimes — a sensitive but expected step for this functionality. Minor mismatch: SKILL.md says 'No browser needed (headless API)' while reauth_with_drive.py explicitly opens a browser or prints a URL for interactive auth.
Install Mechanism
This is instruction-only (no remote download/install spec). The scripts bootstrap a local .colab-venv and install Python deps using a helper tool named 'uv' (the README instructs 'pip install uv'). Re-exec behavior and venv creation are local and self-contained; there are no external arbitrary archive downloads or URL shorteners.
Credentials
The skill requires access to a local OAuth token file (~/.colab-mcp-auth-token.json) and optionally a client config file for reauth; these are directly related to Colab/Drive access and are proportionate. However, the token contains sensitive scopes and refresh tokens. The TTS helper optionally calls ElevenLabs APIs using an API key passed by the user (not stored by the skill) — also proportionate to the TTS feature.
Persistence & Privilege
The skill does write local state (.colab-venv/, ~/.colab-runtime-state.json, ~/.openclaw/private/ren-voice/) and creates temporary files (inject_and_run.sh creates a token-bearing temp script with restricted permissions and cleans it up). always:false and normal model invocation are used. It does not modify other skills or global agent settings.
Assessment
This skill appears to do what it claims, but it requires and manipulates sensitive OAuth tokens. Before installing:

  1. Understand that you must create/maintain ~/.colab-mcp-auth-token.json (it contains OAuth tokens and refresh tokens) and that inject_and_run.sh embeds a base64 copy of that token into a temporary script that is uploaded to and executed on a Colab VM, which exposes the token to the remote runtime.
  2. Use a dedicated Google account or GCP project if you want to limit blast radius, and restrict the OAuth client credentials as much as possible.
  3. Only run reauth_with_drive.py on a trusted machine/browser, and ensure ~/colab-mcp-oauth-config.json is the genuine client config you expect.
  4. Consider revoking the token after use, or rotating it periodically.
  5. Note the small mismatch: the skill advertises "no browser needed", yet reauth may open a browser for the initial Drive scope consent.
  6. Review and confirm you trust any code you inject into Colab (the templates include token placeholders and Drive access).

If any of these steps are unacceptable, do not install or run the skill.

Like a lobster shell, security has layers — review code before you run it.

colab · drive · gpu · latest · training · tts
204 downloads
0 stars
4 versions
Updated 1h ago
v1.3.0
MIT-0

Colab Skill

Execute Python on Google Colab GPU runtimes via headless API. No browser needed.

Setup

  1. Authenticate with Colab (one-time): Run the colab-mcp OAuth flow to create ~/.colab-mcp-auth-token.json. See https://github.com/googlecolab/colab-mcp
  2. Add Drive scope (optional, for persistence): scripts/reauth_with_drive.py
  3. Enable Drive API (if using Drive): https://console.developers.google.com/apis/api/drive.googleapis.com — enable for your GCP project
  4. Python deps: On first run, colab_run.py auto-creates a .colab-venv/ venv via uv and installs deps. Requires uv (install: pip install uv). Deps: google-auth-oauthlib, google-auth, jupyter-kernel-client, requests, google-api-python-client
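After step 2, a quick Python sanity check can confirm the Drive scope actually landed in the token file. This is a minimal sketch: the token JSON layout (a top-level `scopes` list) is an assumption for illustration, not verified against colab-mcp.

```python
import json
from pathlib import Path

def has_drive_scope(token_path: Path) -> bool:
    """Return True if the OAuth token file exists and lists a Drive scope."""
    if not token_path.exists():
        return False
    data = json.loads(token_path.read_text())
    # Assumed layout: granted scopes live in a top-level "scopes" list.
    scopes = data.get("scopes", [])
    return any("drive" in s for s in scopes)

# e.g. has_drive_scope(Path.home() / ".colab-mcp-auth-token.json")
```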

Scripts

All scripts live in scripts/ and use a .colab-venv/ sibling directory for dependencies.

colab_run.py — Execute code on Colab

# Inline code on CPU
python3 scripts/colab_run.py exec "print('hello')"

# Script on T4 GPU
python3 scripts/colab_run.py exec --gpu T4 --file script.py

# Keep runtime alive between calls
python3 scripts/colab_run.py exec --gpu T4 --keep "x = 42"

# GPU options: T4 (default), L4, A100, H100
python3 scripts/colab_run.py exec --gpu A100 --file heavy_training.py

# Runtime management
python3 scripts/colab_run.py list          # Active runtimes
python3 scripts/colab_run.py stop <ep>     # Stop runtime
python3 scripts/colab_run.py info          # CU balance + eligible GPUs

colab_drive.py — Google Drive file transfer

python3 scripts/colab_drive.py upload file.pt --folder colab-workspace
python3 scripts/colab_drive.py download checkpoint.pt --output ./checkpoint.pt
python3 scripts/colab_drive.py list --folder colab-workspace

Requires drive.file OAuth scope (run scripts/reauth_with_drive.py once).

inject_and_run.sh — Run scripts with Drive access inside Colab

Injects the local OAuth token into a script before sending it to Colab. Place __COLAB_TOKEN_PLACEHOLDER__ where the base64 token should go:

bash scripts/inject_and_run.sh my_script.py --gpu T4

Inside the script:

import json, base64
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

token_data = json.loads(base64.b64decode("__COLAB_TOKEN_PLACEHOLDER__"))
with open('/tmp/token.json', 'w') as f:
    json.dump(token_data, f)
creds = Credentials.from_authorized_user_file('/tmp/token.json')
service = build('drive', 'v3', credentials=creds)
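The substitution step that inject_and_run.sh performs can be sketched in a few lines of Python. This is a simplified illustration of the mechanism only; the real script also writes a token-bearing temp copy with restricted permissions and cleans it up afterwards.

```python
import base64
from pathlib import Path

PLACEHOLDER = "__COLAB_TOKEN_PLACEHOLDER__"

def inject_token(script_text: str, token_path: Path) -> str:
    """Base64-encode the local token file and splice it into the script."""
    encoded = base64.b64encode(token_path.read_bytes()).decode("ascii")
    # The encoded token travels inside the script to the Colab VM.
    return script_text.replace(PLACEHOLDER, encoded)
```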

colab_tts.py — Voice synthesis via F5-TTS

# One-time: fetch voice samples from ElevenLabs
python3 scripts/colab_tts.py fetch-voice --voice-id <id> --api-key <key>

# Generate speech with cloned voice
python3 scripts/colab_tts.py speak "Hello world" --output hello.wav --gpu T4

reauth_with_drive.py — Add Drive scope to OAuth token

python3 scripts/reauth_with_drive.py

Opens browser for Google sign-in. One-time setup.

GPU Selection Guide

| GPU  | CU/hr | $/hr       | VRAM    | Use for                                 |
|------|-------|------------|---------|-----------------------------------------|
| T4   | ~2    | $0.20      | 15GB    | Default. Inference, small training, TTS |
| L4   | ~3.5  | $0.35      | 24GB    | Medium models, need bf16 or >15GB       |
| A100 | ~5-15 | $0.50-1.50 | 40/80GB | Large models, serious training          |
| H100 | ~20+  | $2.00+     | 80GB    | Maximum throughput, time-critical       |

Start with T4. Escalate only when VRAM or speed demands it.
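That "start small, escalate" rule can be encoded as a tiny helper. The VRAM figures come from the table above; the function itself is illustrative, not part of the skill.

```python
# (name, VRAM in GB) in ascending cost order, per the table above.
GPUS = [("T4", 15), ("L4", 24), ("A100", 40), ("H100", 80)]

def pick_gpu(vram_needed_gb: float) -> str:
    """Cheapest listed GPU whose VRAM covers the requirement."""
    for name, vram_gb in GPUS:
        if vram_gb >= vram_needed_gb:
            return name
    raise ValueError(f"No listed GPU has {vram_needed_gb} GB of VRAM")
```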

Patterns

Short task (inference, TTS, quick experiment)

Assign → execute → get output → unassign. Don't use --keep.

Multi-step session

Use --keep. State persists in kernel between exec calls. Unassign when done.

Long training with Drive checkpoints

Use inject_and_run.sh. Script checkpoints to Drive every N epochs. If runtime dies, re-run — script loads latest checkpoint from Drive. See references/examples.md for template.

Structured output

Embed parseable markers in script stdout:

print(f"__RESULT__{json.dumps(metrics)}")    # Structured data
print(f"__AUDIO__{base64_audio}")             # Binary as base64

Timeouts

  • Idle timeout (Pro): ~90 min without active execution
  • Max session (Pro): ~24 hours continuous
  • Active code execution prevents idle disconnect
  • Runtime death loses /tmp/ — use Drive for anything you need to keep

API Quirk

GPU assignment requires split GET/POST: GET fetches XSRF token (no GPU params), POST creates assignment with variant=GPU&accelerator=T4. Already handled by colab_run.py.
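The split flow can be sketched with an injected HTTP session. The endpoint path, the `xsrf_token` extraction regex, and the response shape here are assumptions for illustration only; colab_run.py already implements the real handshake.

```python
import re

def assign_gpu(session, base_url: str, accelerator: str = "T4") -> str:
    """Two-step assignment: GET for the XSRF token, then POST with GPU params."""
    # Step 1: plain GET, no GPU parameters, just to obtain the XSRF token.
    page = session.get(f"{base_url}/assign").text
    token = re.search(r"xsrf_token=(\w+)", page).group(1)
    # Step 2: POST actually creates the assignment with variant/accelerator.
    resp = session.post(
        f"{base_url}/assign",
        data={"variant": "GPU", "accelerator": accelerator, "xsrf_token": token},
    )
    return resp.json()["endpoint"]
```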
