Skill Distiller (One-Liner)

v0.2.1

Skill compression reminder in ~100 tokens: just trigger, action, result.

by Lee Brown (@leegitw)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt below, then paste it into OpenClaw to install leegitw/neon-skill-distiller-oneliner.

Prompt Preview: Install & Setup
Install the skill "Skill Distiller (One-Liner)" (leegitw/neon-skill-distiller-oneliner) from ClawHub.
Skill page: https://clawhub.ai/leegitw/neon-skill-distiller-oneliner
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install neon-skill-distiller-oneliner

ClawHub CLI

Package manager

npx clawhub@latest install neon-skill-distiller-oneliner
Security Scan
VirusTotal
Benign
OpenClaw
Benign
high confidence
Purpose & Capability
The name and description (skill compression to ~100 tokens) align with the SKILL.md: no extra binaries, env vars, or installs are requested that would be unrelated to simple text processing.
Instruction Scope
SKILL.md gives high-level instructions to parse and remove low-value sections using an LLM and to preserve certain YAML patterns. The instructions are scoped to skill markdown content and do not ask for system files or credentials. However, they are terse and grant the LLM broad discretion about what to remove, so outputs should be reviewed before use.
Install Mechanism
No install spec and no code files: this is the lowest-risk model, since nothing is written to disk or downloaded by the skill itself.
Credentials
The skill requests no environment variables, credentials, or config paths; this is proportional for a purely text-compression helper.
Persistence & Privilege
Because always is false and disable-model-invocation is true, the skill cannot run autonomously and is only user-invocable, reducing the risk of unexpected action.
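The two flags described above would sit in the skill's YAML frontmatter. A sketch of how that could look (the name and description come from this page; any field beyond the two flags quoted in the review is an assumption about the schema):

```yaml
---
name: neon-skill-distiller-oneliner
description: Skill compression reminder in ~100 tokens
always: false                    # not auto-loaded into every context
disable-model-invocation: true   # only the user can invoke it, not the model
---
```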
Assessment
This skill appears internally consistent and low-risk. Before using it broadly: (1) run it against non-critical skill markdown to confirm it preserves required fields, (2) keep a backup of the original skill text so nothing is irrevocably lost, and (3) review the distilled output before deploying. The instructions give the LLM latitude to delete sections, so human review prevents accidental loss of important behavior or permissions.
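Step (2) above, keeping a backup of the original skill text, can be sketched in a few lines (the function name and file layout are illustrative, not part of the skill):

```python
import shutil
import time
from pathlib import Path

def backup_skill(skill_path: str) -> Path:
    """Copy the original SKILL.md aside before running the distiller."""
    src = Path(skill_path)
    # Timestamped sibling file, e.g. SKILL.md.1700000000.bak
    backup = src.with_name(src.name + f".{int(time.time())}.bak")
    shutil.copy2(src, backup)  # copies content and file metadata
    return backup
```

With a backup in place, a distillation run that deletes too much can always be reverted by restoring the `.bak` file.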

Like a lobster shell, security has layers: review code before you run it.

Tags: compression, context-window, latest, minimal, openclaw, optimization, quick-reference, skills, token-reduction
79 downloads
0 stars
3 versions
Updated 1w ago
v0.2.1
MIT-0

Skill Distiller (One-Liner)

Minimal reference variant (~100 tokens, ~70% functionality, LLM-estimated). Full reference: ../SKILL.reference.md.

TRIGGER: User asks to compress, distill, or reduce a skill's context usage

ACTION: Parse the skill into sections (TRIGGER/INSTRUCTION/EXAMPLE/etc.), score importance via LLM, and remove low-value sections while preserving protected patterns (YAML name/description, N-count tracking, task creation)

RESULT: Compressed skill markdown with functionality score (0-100%), token reduction stats, and list of removed/kept sections
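The TRIGGER/ACTION/RESULT loop above can be sketched as a small pipeline. This is a hedged approximation, not the skill's actual code: the LLM importance scorer is stubbed out as a callable, sections are naively split on blank lines, and only the YAML name/description protection from the ACTION line is modeled:

```python
import re

# Protected patterns named in the ACTION line; the full set is an assumption.
PROTECTED = re.compile(r"^(name|description)\s*:", re.IGNORECASE)

def distill(skill_md: str, score, keep_threshold: float = 0.5):
    """Split a skill into blank-line-separated sections, keep protected
    sections unconditionally, and drop sections scoring below the threshold.
    `score` stands in for the LLM importance scorer (section -> 0..1)."""
    sections = [s for s in skill_md.split("\n\n") if s.strip()]
    kept, removed = [], []
    for s in sections:
        protected = any(PROTECTED.match(line.strip()) for line in s.splitlines())
        (kept if protected or score(s) >= keep_threshold else removed).append(s)
    stats = {
        "kept": len(kept),
        "removed": len(removed),
        # Rough character-based proxy for the token-reduction stat.
        "reduction": 1 - sum(map(len, kept)) / max(len(skill_md), 1),
    }
    return "\n\n".join(kept), stats
```

Feeding it a toy skill with a high-scoring TRIGGER section and a low-scoring EXAMPLE section keeps the former, drops the latter, and always retains the YAML name/description block, mirroring the RESULT summary of kept/removed sections and reduction stats.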


Related

| Variant | Tokens | Functionality |
| --- | --- | --- |
| skill-distiller (main) | ~400 | ~90% (formula) |
| compressed | ~975 | ~90% (prose) |
| oneliner (this) | ~100 | ~70% |

Full reference: SKILL.reference.md (~2,500 tokens, ~90%)

Functionality scores are LLM-estimated.
