
skill-exporter

Export Clawdbot skills as standalone, deployable microservices.


Installation

npx clawhub@latest install skill-exporter

View the full skill documentation and source below.

Documentation

Skill Exporter

Transform Clawdbot skills into standalone, deployable microservices.

Workflow

Clawdbot Skill (tested & working)
         ↓
    skill-exporter
         ↓
Standalone Microservice
         ↓
Railway / Fly.io / Docker

Usage

Export a skill

python3 {baseDir}/scripts/export.py \
  --skill ~/.clawdbot/skills/instagram \
  --target railway \
  --llm anthropic \
  --output ~/projects/instagram-service

Options

Flag        Description                                Default
--skill     Path to skill directory                    required
--target    Deployment target: railway, fly, docker    docker
--llm       LLM provider: anthropic, openai, none      none
--output    Output directory                           ./<skill>-service
--port      API port                                   8000
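
The flags above could be wired up with argparse; here is a minimal sketch (flag names and stated defaults come from the options table, everything else — the function name, help strings — is illustrative):

```python
import argparse

def build_parser():
    # Flags mirror the options table; --output defaults to None here
    # and would be resolved to ./<skill>-service later.
    p = argparse.ArgumentParser(
        prog="export.py",
        description="Export a Clawdbot skill as a standalone microservice")
    p.add_argument("--skill", required=True, help="Path to skill directory")
    p.add_argument("--target", choices=["railway", "fly", "docker"],
                   default="docker", help="Deployment target")
    p.add_argument("--llm", choices=["anthropic", "openai", "none"],
                   default="none", help="LLM provider")
    p.add_argument("--output", default=None,
                   help="Output directory (defaults to ./<skill>-service)")
    p.add_argument("--port", type=int, default=8000, help="API port")
    return p
```

Restricting --target and --llm with choices= makes invalid values fail fast at parse time instead of midway through an export.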

Targets

railway — Generates railway.json, optimized Dockerfile, health checks
fly — Generates fly.toml, multi-region ready
docker — Generic Dockerfile, docker-compose.yml
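
As a rough illustration, the railway target's railway.json might take a shape like the following (the field values are assumptions for illustration, not the exporter's actual output):

```json
{
  "build": { "builder": "DOCKERFILE" },
  "deploy": {
    "healthcheckPath": "/health",
    "restartPolicyType": "ON_FAILURE"
  }
}
```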

LLM Integration

When --llm is set, the exporter generates llm_client.py with:

  • Caption/prompt generation

  • Decision making helpers

  • Rate limiting and error handling
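
The rate-limiting piece could look something like this sketch, using only the standard library (the decorator name and interval are illustrative; the generated llm_client.py will differ):

```python
import time
import functools

def rate_limited(min_interval: float):
    """Decorator: enforce a minimum delay between calls (illustrative)."""
    def deco(fn):
        last_call = [0.0]  # monotonic timestamp of the previous call
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            wait = min_interval - (time.monotonic() - last_call[0])
            if wait > 0:
                time.sleep(wait)
            last_call[0] = time.monotonic()
            return fn(*args, **kwargs)
        return wrapper
    return deco

@rate_limited(min_interval=0.5)
def generate_caption(prompt: str) -> str:
    # Placeholder for the provider call (anthropic/openai); a real
    # client would also retry on transient API errors here.
    return f"caption for: {prompt}"
```

Throttling at the client wrapper keeps every skill endpoint under the provider's rate limits without each script having to manage timing itself.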


What Gets Generated

<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py              # FastAPI wrapper
├── llm_client.py       # If --llm specified
├── requirements.txt
├── .env.example
├── railway.json        # If --target railway
├── fly.toml            # If --target fly
└── scripts/            # Copied from original skill
    └── *.py

Requirements

The source skill must have:

  • SKILL.md with valid frontmatter

  • At least one script in scripts/

  • Scripts should be callable (functions, not just inline code)
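
A pre-flight check for these requirements might look like this sketch (the function name and error messages are assumptions; the exporter's own validation may differ):

```python
from pathlib import Path

def validate_skill(skill_dir: str) -> list[str]:
    """Return a list of problems; an empty list means the skill looks exportable."""
    root = Path(skill_dir)
    problems = []
    skill_md = root / "SKILL.md"
    if not skill_md.is_file():
        problems.append("missing SKILL.md")
    elif not skill_md.read_text().lstrip().startswith("---"):
        # Frontmatter blocks are delimited by a leading "---" line.
        problems.append("SKILL.md has no frontmatter block")
    if not list((root / "scripts").glob("*.py")):
        problems.append("no scripts in scripts/")
    return problems
```

Running a check like this before generating any files avoids producing a half-built service directory from an incomplete skill.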


Post-Export

  • Copy .env.example to .env and fill in secrets

  • Test locally: docker-compose up

  • Deploy: railway up or fly deploy