Install

```shell
openclaw skills install skill-exporter
```

Export Clawdbot skills as standalone, deployable microservices. Use this skill when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. It generates a Dockerfile, a FastAPI wrapper, `requirements.txt`, deployment configs, and optional LLM client integration.
```
Clawdbot Skill (tested & working)
        ↓
  skill-exporter
        ↓
Standalone Microservice
        ↓
Railway / Fly.io / Docker
```
```shell
python3 {baseDir}/scripts/export.py \
  --skill ~/.clawdbot/skills/instagram \
  --target railway \
  --llm anthropic \
  --output ~/projects/instagram-service
```
| Flag | Description | Default |
|---|---|---|
| `--skill` | Path to skill directory | required |
| `--target` | Deployment target: `railway`, `fly`, `docker` | `docker` |
| `--llm` | LLM provider: `anthropic`, `openai`, `none` | `none` |
| `--output` | Output directory | `./<skill-name>-service` |
| `--port` | API port | `8000` |
- `railway` — generates `railway.json`, an optimized Dockerfile, and health checks
- `fly` — generates `fly.toml`, multi-region ready
- `docker` — generates a generic Dockerfile and `docker-compose.yml`
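For orientation, a `railway.json` for a Dockerfile-built service generally looks something like the fragment below. The field values are illustrative, not the exporter's exact output:

```json
{
  "build": {
    "builder": "DOCKERFILE",
    "dockerfilePath": "Dockerfile"
  },
  "deploy": {
    "healthcheckPath": "/health",
    "restartPolicyType": "ON_FAILURE"
  }
}
```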
When `--llm` is set, the export also generates `llm_client.py`, wired to the chosen provider.
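The exact contents of the generated client aren't documented here; as a rough, stdlib-only sketch of what a provider-agnostic client might look like (class and method names are illustrative, not the tool's actual output):

```python
import json
import os
import urllib.request


class LLMClient:
    """Minimal provider-agnostic chat client (illustrative sketch only)."""

    ENDPOINTS = {
        "anthropic": "https://api.anthropic.com/v1/messages",
        "openai": "https://api.openai.com/v1/chat/completions",
    }

    def __init__(self, provider, model, api_key=None):
        if provider not in self.ENDPOINTS:
            raise ValueError(f"unsupported provider: {provider}")
        self.provider = provider
        self.model = model
        # Falls back to e.g. ANTHROPIC_API_KEY / OPENAI_API_KEY from the env
        self.api_key = api_key or os.environ.get(f"{provider.upper()}_API_KEY", "")

    def build_request(self, prompt):
        """Assemble the provider-specific HTTP request without sending it."""
        if self.provider == "anthropic":
            headers = {"x-api-key": self.api_key,
                       "anthropic-version": "2023-06-01"}
            body = {"model": self.model, "max_tokens": 1024,
                    "messages": [{"role": "user", "content": prompt}]}
        else:  # openai
            headers = {"Authorization": f"Bearer {self.api_key}"}
            body = {"model": self.model,
                    "messages": [{"role": "user", "content": prompt}]}
        headers["content-type"] = "application/json"
        return urllib.request.Request(
            self.ENDPOINTS[self.provider],
            data=json.dumps(body).encode(),
            headers=headers,
            method="POST",
        )
```

Sending the request (via `urllib.request.urlopen` or any HTTP client) and parsing the response is left out; the sketch only shows the provider-switching shape such a client tends to have.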
```
<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py              # FastAPI wrapper
├── llm_client.py       # if --llm specified
├── requirements.txt
├── .env.example
├── railway.json        # if --target railway
├── fly.toml            # if --target fly
└── scripts/            # copied from the original skill
    └── *.py
```
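The generated `api.py` isn't reproduced in this document. Its core job, exposing the copied skill scripts over HTTP, can be sketched dependency-free; the real file uses FastAPI, and the function and field names below are illustrative assumptions:

```python
import subprocess
from pathlib import Path

# The exported service keeps the skill's scripts alongside api.py
SCRIPTS_DIR = Path("scripts")


def run_skill_script(name, args=None):
    """Invoke one of the exported skill's scripts and capture its output.

    Roughly what a generated POST /run/{script} endpoint would do
    before serializing the result to JSON.
    """
    script = SCRIPTS_DIR / f"{name}.py"
    if not script.is_file():
        return {"ok": False, "error": f"unknown script: {name}"}
    proc = subprocess.run(
        ["python3", str(script), *(args or [])],
        capture_output=True, text=True, timeout=60,
    )
    return {
        "ok": proc.returncode == 0,
        "stdout": proc.stdout,
        "stderr": proc.stderr,
    }
```

In the actual wrapper this logic would sit behind a FastAPI route, with request validation and a `/health` endpoint for the deployment target's health checks.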
The source skill must have:

- `SKILL.md` with valid frontmatter
- a `scripts/` directory

To run the exported service:

1. Copy `.env.example` to `.env` and fill in secrets
2. Start locally with `docker-compose up`, or deploy with `railway up` or `fly deploy`