Skill Exporter
Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration.
MIT-0 · Free to use, modify, and redistribute. No attribution required.
⭐ 1 · 1.7k · 2 current installs · 2 all-time installs
Security Scan
OpenClaw
Benign (high confidence)
Purpose & Capability
Name/description match what the files do: scripts/export.py reads a local skill directory and emits Dockerfile, FastAPI wrapper, requirements, llm client stubs, and copies the skill scripts. Required binary (python3) is appropriate and no unrelated external credentials or binaries are requested.
Instruction Scope
SKILL.md instructs running export.py against a local skill path; export.py reads SKILL.md, scripts/, and .env (if present) from the source skill to detect metadata, env var names, and imports. It does not appear to execute or import the target scripts, but it will copy them into the output service. The generated api.py template allows CORS for all origins and includes subprocess usage — acceptable for a generic wrapper but worth reviewing if you plan to deploy publicly.
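A wrapper of this shape typically shells out to the copied skill scripts. As a reference point when reviewing the generated api.py, here is a minimal sketch of the safer subprocess pattern (list-form argv, no shell=True, explicit timeout); this is an illustration, not the exporter's actual code:

```python
import subprocess
import sys
from pathlib import Path

def run_skill_script(script: Path, args: list[str], timeout: int = 30) -> str:
    """Invoke a copied skill script as a subprocess.

    Uses list-form argv (never shell=True) so arguments are not
    shell-interpreted, and enforces a timeout. Illustrative sketch of
    the pattern to look for when reviewing the generated api.py.
    """
    result = subprocess.run(
        [sys.executable, str(script), *args],
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    if result.returncode != 0:
        raise RuntimeError(f"{script.name} failed: {result.stderr.strip()}")
    return result.stdout
```

If the generated wrapper instead builds command strings and passes `shell=True`, that is the kind of call worth tightening before public deployment.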
Install Mechanism
No install spec — this is instruction + a generator script. There are no downloads or extracted archives in the skill itself, so there is no installer risk from this package.
Credentials
The exporter itself doesn't require any credentials. When you request LLM integration (--llm), the generated llm_client files expect standard provider API keys (ANTHROPIC_API_KEY / OPENAI_API_KEY) in the runtime environment — this is proportional to optional LLM functionality. The exporter scans for env var NAMES in the source skill and .env to populate .env.example, but it does not appear to exfiltrate secret VALUES.
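The name-only scan described above can be pictured with a small sketch: parse a .env file and keep the variable names while discarding the values. This is a hypothetical illustration, not the exporter's actual parser:

```python
import re
from pathlib import Path

# Matches lines like `API_KEY=secret` or `export API_KEY=secret`,
# capturing only the variable name to the left of `=`.
_ENV_LINE = re.compile(r"^\s*(?:export\s+)?([A-Za-z_][A-Za-z0-9_]*)\s*=")

def env_var_names(env_file: Path) -> list[str]:
    """Return the variable NAMES declared in a .env file, never the values."""
    names = []
    for line in env_file.read_text().splitlines():
        match = _ENV_LINE.match(line)
        if match:
            names.append(match.group(1))
    return names
```

Names collected this way can populate a .env.example with placeholder values, which is consistent with what the exporter claims to do.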
Persistence & Privilege
The `always` flag is false and the skill doesn't request persistent platform privileges. The tool writes files to the chosen output directory and marks copied scripts executable; it does not modify other skills or global agent configuration.
Assessment
This exporter appears to do what it says, but review the generated service before deploying. Specific suggestions:
1. Inspect the generated api.py for the wide-open CORS policy and adjust it, or add authentication, if the service will be public.
2. Review any subprocess calls and copied scripts to ensure they don't run unsafe commands or expect secrets embedded in code.
3. The exporter detects and surfaces environment variable NAMES from the source skill and creates a .env.example. Do not put real secret values into the source skill's .env when running the exporter (work on a sanitized copy), and do not commit a .env containing secrets to source control.
4. If you enable --llm, you will need to provide provider API keys at runtime; only supply keys you trust and scope them appropriately.
5. Test the generated Docker image locally before deploying to Railway/Fly.io or exposing it to the internet.

Like a lobster shell, security has layers: review code before you run it.
Current version: v1.0.0
Tags: api, automation, deploy, docker, export, fastapi, flyio, latest, microservice, railway, standalone
License
MIT-0
Free to use, modify, and redistribute. No attribution required.
Runtime requirements
📦 Clawdis
Bins: python3
SKILL.md
Skill Exporter
Transform Clawdbot skills into standalone, deployable microservices.
Workflow
Clawdbot Skill (tested & working)
↓
skill-exporter
↓
Standalone Microservice
↓
Railway / Fly.io / Docker
Usage
Export a skill
```bash
python3 {baseDir}/scripts/export.py \
  --skill ~/.clawdbot/skills/instagram \
  --target railway \
  --llm anthropic \
  --output ~/projects/instagram-service
```
Options
| Flag | Description | Default |
|---|---|---|
| `--skill` | Path to skill directory | required |
| `--target` | Deployment target: `railway`, `fly`, `docker` | `docker` |
| `--llm` | LLM provider: `anthropic`, `openai`, `none` | `none` |
| `--output` | Output directory | `./<skill-name>-service` |
| `--port` | API port | `8000` |
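The documented flags map to a straightforward CLI. A minimal argparse sketch reproducing the table above (the real export.py may define these differently):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """CLI mirroring the documented flags; defaults match the options table."""
    parser = argparse.ArgumentParser(prog="export.py")
    parser.add_argument("--skill", required=True, help="Path to skill directory")
    parser.add_argument("--target", choices=["railway", "fly", "docker"],
                        default="docker", help="Deployment target")
    parser.add_argument("--llm", choices=["anthropic", "openai", "none"],
                        default="none", help="LLM provider")
    parser.add_argument("--output", default=None,
                        help="Output directory (default: ./<skill-name>-service)")
    parser.add_argument("--port", type=int, default=8000, help="API port")
    return parser
```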
Targets
- `railway` — Generates railway.json, optimized Dockerfile, health checks
- `fly` — Generates fly.toml, multi-region ready
- `docker` — Generic Dockerfile, docker-compose.yml
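For the generic docker target, the generated docker-compose.yml should be roughly of this shape; this is a hypothetical sketch, and the actual service name and settings depend on the exported skill:

```yaml
# Hypothetical sketch of a generated docker-compose.yml
services:
  skill-service:
    build: .
    ports:
      - "8000:8000"   # matches the --port flag
    env_file:
      - .env          # copied from .env.example and filled in locally
```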
LLM Integration
When --llm is set, generates llm_client.py with:
- Caption/prompt generation
- Decision making helpers
- Rate limiting and error handling
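The generated llm_client.py's rate limiting is not shown here; as an illustration of the general pattern only, a minimal interval-based limiter in stdlib Python might look like this (hypothetical, not the generated code):

```python
import time

class RateLimiter:
    """Allow at most one call per `min_interval` seconds, sleeping otherwise."""

    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last_call = 0.0

    def wait(self) -> None:
        # Sleep just long enough to keep calls `min_interval` apart.
        elapsed = time.monotonic() - self._last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last_call = time.monotonic()
```

A client would call `limiter.wait()` before each provider API request, which keeps bursts of requests from tripping provider-side limits.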
What Gets Generated
<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py # FastAPI wrapper
├── llm_client.py # If --llm specified
├── requirements.txt
├── .env.example
├── railway.json # If --target railway
├── fly.toml # If --target fly
└── scripts/ # Copied from original skill
└── *.py
Requirements
The source skill must have:
- `SKILL.md` with valid frontmatter
- At least one script in `scripts/`
- Scripts should be callable (functions, not just inline code)
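The requirements above can be checked mechanically before running the exporter. A small sketch of such a pre-flight check (a hypothetical helper, not part of this skill; it assumes frontmatter is a leading `---` block):

```python
from pathlib import Path

def check_skill_dir(skill_dir: Path) -> list[str]:
    """Return a list of problems that would block export (empty = OK)."""
    problems = []
    skill_md = skill_dir / "SKILL.md"
    if not skill_md.is_file():
        problems.append("missing SKILL.md")
    elif not skill_md.read_text().lstrip().startswith("---"):
        problems.append("SKILL.md has no frontmatter block")
    # The exporter copies scripts/, so there must be something to copy.
    if not list((skill_dir / "scripts").glob("*.py")):
        problems.append("no Python scripts in scripts/")
    return problems
```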
Post-Export
- Copy `.env.example` to `.env` and fill in secrets
- Test locally: `docker-compose up`
- Deploy: `railway up` or `fly deploy`
Files
2 total
