## Install

`openclaw skills install knowledge-distillation`

Distill OpenClaw daily memory, session transcripts, and newly generated report files into new knowledge points and deeper knowledge leads. Use this skill when the inputs are workspace-native materials such as MEMORY.md, memory/*.md, session logs, daily notes, summaries, or generated report files, and the goal is to extract (1) newly formed knowledge worth retaining and (2) promising knowledge threads worth further study. The output should be a dated Markdown file.
This skill is an OpenClaw internal knowledge distiller.
Its job is not to summarize everything. Its job is to scan agent-native working materials, identify what is newly learned, and separate that from what should be investigated, connected, or strengthened next.
Use this skill when the source materials come from the OpenClaw environment, especially:
- MEMORY.md
- memory/*.md

Treat these as raw internal learning material.
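As a rough sketch, the workspace-native inputs could be gathered with a glob pass. The directory names for session logs and reports below are assumptions, not part of the skill's contract; adjust them to the actual workspace layout:

```python
from pathlib import Path

def gather_inputs(root="."):
    """Collect candidate distillation inputs from an OpenClaw-style workspace."""
    root = Path(root)
    patterns = [
        "MEMORY.md",       # long-term memory file
        "memory/*.md",     # per-day memory notes
        "sessions/*.log",  # session transcripts (assumed location)
        "reports/*.md",    # generated report files (assumed location)
    ]
    files = []
    for pattern in patterns:
        files.extend(sorted(root.glob(pattern)))
    return files
```

Keeping the pattern list explicit makes it easy to weight sources differently later, rather than treating everything found as equal.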
From the input set, produce two things:
New Knowledge Points
Knowledge Leads Worth Deepening
Identify what each input contributes:
Do not treat all sources equally. Give more weight to repeated evidence across multiple sources.
Look for:
Prefer signal over chronology.
Promote something to New Knowledge Points only when at least one of these is true:
Keep something in Knowledge Leads Worth Deepening when:
Do not list near-duplicate observations separately.
Merge them upward into:
Each knowledge point should include a short basis such as:
Do not fabricate precision. Keep basis brief and honest.
For each deepen-able knowledge point, explain how to deepen it, for example:
The output must be a dated Markdown file.
Filename format:
`knowledge-distillation-YYYY-MM-DD.md`

If multiple runs happen on the same day, use one of:

- `knowledge-distillation-YYYY-MM-DD-01.md`
- `knowledge-distillation-YYYY-MM-DD-02.md`

Use this structure unless the user explicitly asks for another one:
# Knowledge Distillation - YYYY-MM-DD
## Input Summary
- Memory files:
- Session/log sources:
- Report files:
## New Knowledge Points
### 1. Title
- Conclusion:
- Basis:
- Value:
- Scope:
### 2. Title
- Conclusion:
- Basis:
- Value:
- Scope:
## Knowledge Leads Worth Deepening
### 1. Title
- Current observation:
- Why worth deepening:
- Current gaps:
- Next step suggestions:
### 2. Title
- Current observation:
- Why worth deepening:
- Current gaps:
- Next step suggestions:
## Distillation Conclusions This Round
- Most worth retaining (1-3 points):
- Most worth tracking (1-3 leads):
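The filename convention above (a dated base name, with `-01`/`-02` suffixes for same-day reruns) could be sketched as follows. This is a minimal illustration, not part of the skill itself:

```python
from datetime import date
from pathlib import Path

def output_filename(out_dir=".", today=None):
    """Return a dated output path, adding -01/-02 suffixes for same-day reruns."""
    stamp = (today or date.today()).isoformat()  # YYYY-MM-DD
    base = f"knowledge-distillation-{stamp}"
    out = Path(out_dir)
    candidate = out / f"{base}.md"
    if not candidate.exists():
        return candidate
    # A file for today already exists: find the next free numbered suffix.
    n = 1
    while (out / f"{base}-{n:02d}.md").exists():
        n += 1
    return out / f"{base}-{n:02d}.md"
```

Checking for existing files, rather than tracking run counts, keeps reruns idempotent even if earlier runs were deleted.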
For reusable variants, read references/output-templates.md.
Use this skill for requests like:
- `references/output-templates.md`: dated Markdown output variants for standard runs, report-heavy runs, and follow-up runs