Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

opencode-session-toolkit

v1.0.0

Read the local OpenCode SQLite database, run cross-directory session queries, and export sessions to Markdown files.

by Wu Fei (@wufei-png)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for wufei-png/opencode-session-toolkit.

Prompt preview (Install & Setup):
Install the skill "opencode-session-toolkit" (wufei-png/opencode-session-toolkit) from ClawHub.
Skill page: https://clawhub.ai/wufei-png/opencode-session-toolkit
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install opencode-session-toolkit

ClawHub CLI


npx clawhub@latest install opencode-session-toolkit
Security Scan

VirusTotal: Benign
OpenClaw: Suspicious (medium confidence)

Purpose & Capability
The skill's stated purpose (read the OpenCode DB and export sessions) matches the included code and SKILL.md. However, the registry metadata claims no required binaries or environment variables, while the SKILL.md and the script rely on the local `opencode` CLI to resolve the DB path, and the examples rely on `sqlite3`, `column`, and standard shell utilities. That mismatch is unexpected and may cause runtime failures or hide dependencies.
Instruction Scope
The runtime instructions and the bundled Python script stay within the stated scope: they resolve a local DB path, open the database read-only, run queries, and write Markdown files. There are no network endpoints, secret exfiltration, or commands that read unrelated system configuration. The SKILL.md does include examples that search message JSON (which may contain sensitive session content) — expected for this purpose but worth noting.
Install Mechanism
There is no install spec (instruction-only plus a bundled Python script). The script is pure-Python and uses stdlib modules; nothing is downloaded from external URLs. This is a low-risk install model. One oddity: the script shebang uses "#!/usr/bin/env -S uv run --script" which is unusual and may not work on many systems; the SKILL.md recommends running the script directly with a Python interpreter.
Credentials
No credentials or secrets are requested (good). But the skill implicitly depends on local tools and env vars (opencode CLI, possibly sqlite3, XDG_DATA_HOME/HOME) while the registry metadata lists none — an omission that reduces transparency. The script itself does not access environment variables beyond standard XDG/HOME resolution and does not transmit data externally.
Persistence & Privilege
The skill is not marked always:true and does not request system-wide persistence or modify other skills. The agent config (agents/openai.yaml) allows implicit invocation, which is common and expected; this is not a standalone red flag here.
What to consider before installing
This skill appears to implement exactly what it says (read a local OpenCode SQLite DB and export sessions), but verify a few things before you install or use it:

  1. The SKILL.md and script call the local `opencode` CLI to resolve the DB path (and the examples use `sqlite3`, `column`, `date`). Make sure those tools are present and trustworthy; the registry metadata failing to list them is an omission.
  2. The script opens the DB read-only (good), but exported sessions may contain sensitive chat/message contents. Review the output location and use filters (or avoid --all) if you don't want to export everything.
  3. The script requires Python 3.11+ per its header, and the shebang is unusual; run it explicitly with a known interpreter: python3.11 ./scripts/export_opencode_sessions.py --output-dir <dir> [filters].
  4. For higher assurance, inspect the included script (scripts/export_opencode_sessions.py) yourself and test it in a safe, isolated environment.

If the metadata were corrected to list the external CLI dependencies and clarify the Python requirement, the coherence concerns would be resolved.
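The tool check in item 1 can be sketched as a small pre-flight script. The tool names come from the scan notes above; the script only reports status and installs nothing:

```shell
# Pre-flight check for the external tools this skill implicitly depends on.
# Prints one status line per tool; does not install or modify anything.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

for tool in opencode sqlite3 column python3; do
  check_tool "$tool"
done
```

Any `missing:` line means the corresponding examples below will fail until that tool is installed.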

Like a lobster shell, security has layers — review code before you run it.

latest: vk97e27j0meq11427jqcmy5ya7x83hh1k
87 downloads · 0 stars · 1 version
Updated 1mo ago
v1.0.0 · MIT-0

OpenCode Session Toolkit

Read the local OpenCode SQLite database and query or export sessions, messages, parts, and projects across directories.

All commands below assume the workdir is this skill directory. For Markdown export, run the bundled script directly:

./scripts/export_opencode_sessions.py --help

When to use

  • List recent sessions, filter by directory, or search by title
  • Read message JSON for a specific session
  • Export matched sessions into one-Markdown-per-session archives
  • Inspect database schema and indexes (load references/schema.md only when needed)

Workflow

  1. Resolve the database path with opencode db path.
  2. Run all queries in read-only mode.
  3. Load references/schema.md only when field-level details are required.

1. Resolve the database path

if ! command -v opencode >/dev/null 2>&1; then
  echo "opencode command not found in PATH" >&2
  exit 1
fi

if ! DB_PATH="$(opencode db path 2>/dev/null)"; then
  echo "Failed to resolve OpenCode DB path via: opencode db path" >&2
  exit 1
fi

if [ -z "${DB_PATH:-}" ] || [ ! -f "$DB_PATH" ]; then
  echo "OpenCode DB not found: $DB_PATH" >&2
  exit 1
fi

echo "Using DB: $DB_PATH"

List existing DB files (no error when there is no match):

find "${XDG_DATA_HOME:-$HOME/.local/share}/opencode" -maxdepth 1 -name '*.db' -print 2>/dev/null

2. Time conversion and output formatting

Time conversion: all time fields are Unix timestamps in milliseconds. Convert them directly in SQL with datetime().

# Convert in SQL (recommended, no external command needed)
datetime(time_updated/1000, 'unixepoch', 'localtime')

# Shell helpers for time windows
NOW_MS=$(date +%s000)
LAST_7D=$((NOW_MS - 7*86400*1000))
LAST_30D=$((NOW_MS - 30*86400*1000))
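The conversion can be sanity-checked without touching any real data, using an in-memory database and a fixed timestamp ('localtime' is omitted so the output is stable across machines):

```shell
# 1700000000000 ms = 2023-11-14 22:13:20 UTC; dividing by 1000 yields
# the seconds that datetime() expects with 'unixepoch'.
sqlite3 ':memory:' "SELECT datetime(1700000000000/1000,'unixepoch');"
# → 2023-11-14 22:13:20
```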

Table alignment: for normal fields, pipe SQLite output to column -t -s '|' (| is SQLite's default delimiter). For long JSON fields such as message.data, prefer -json output.

sqlite3 -readonly "$DB_PATH" "SELECT id, title, time_updated FROM session LIMIT 5;" | column -t -s '|'

3. Common read-only queries

Tip: For queries without large JSON fields, append | column -t -s '|' for aligned table output.

List the latest 20 sessions (most recently updated first)

sqlite3 -readonly "$DB_PATH" \
  "SELECT id, title, directory,
          datetime(time_updated/1000,'unixepoch','localtime') as updated
   FROM session
   ORDER BY time_updated DESC
   LIMIT 20;" | column -t -s '|'

Filter sessions by directory

sqlite3 -readonly "$DB_PATH" \
  "SELECT id, title, datetime(time_updated/1000,'unixepoch','localtime') as updated
   FROM session
   WHERE directory LIKE '/path/to/project%'
   ORDER BY time_updated DESC
   LIMIT 20;" | column -t -s '|'

Filter sessions by project_id (most precise project linkage)

sqlite3 -readonly "$DB_PATH" \
  "SELECT s.id, s.title, s.directory,
          datetime(s.time_updated/1000,'unixepoch','localtime') as updated
   FROM session s
   WHERE s.project_id = 'your-project-id'
   ORDER BY s.time_updated DESC
   LIMIT 20;" | column -t -s '|'

project_id maps to project.id. List projects with:

sqlite3 -readonly "$DB_PATH" "SELECT id, worktree, name FROM project;" | column -t -s '|'

List sessions across all directories (with project info)

sqlite3 -readonly "$DB_PATH" \
  "SELECT s.id, s.title, s.directory, p.worktree,
          datetime(s.time_updated/1000,'unixepoch','localtime') as updated
   FROM session s
   LEFT JOIN project p ON s.project_id = p.id
   ORDER BY s.time_updated DESC
   LIMIT 50;" | column -t -s '|'

Filter by time range

# Sessions active in the last 7 days
sqlite3 -readonly "$DB_PATH" \
  "SELECT id, title, datetime(time_updated/1000,'unixepoch','localtime') as updated
   FROM session
   WHERE time_updated > $(( $(date +%s000) - 7*86400*1000 ))
   ORDER BY time_updated DESC
   LIMIT 20;" | column -t -s '|'

# Sessions created today (local time)
sqlite3 -readonly "$DB_PATH" \
  "SELECT id, title, datetime(time_created/1000,'unixepoch','localtime') as created
   FROM session
   WHERE date(time_created/1000,'unixepoch','localtime') = date('now','localtime')
   ORDER BY time_created DESC
   LIMIT 20;" | column -t -s '|'

Read message content for one session

sqlite3 -readonly -json "$DB_PATH" \
  "SELECT m.id, datetime(m.time_created/1000,'unixepoch','localtime') as created, m.data
   FROM message m
   WHERE m.session_id = 'your-session-id'
   ORDER BY m.time_created ASC;"

Extract fields from message.data JSON

# Extract key fields such as role and modelID
sqlite3 -readonly "$DB_PATH" \
  "SELECT id,
          json_extract(data, '$.role') as role,
          json_extract(data, '$.modelID') as model,
          datetime(time_created/1000,'unixepoch','localtime') as created
   FROM message
   WHERE session_id = 'your-session-id'
   ORDER BY time_created ASC;" | column -t -s '|'

# Search message payload text with LIKE
sqlite3 -readonly "$DB_PATH" \
  "SELECT id, json_extract(data, '$.role') as role, time_created
   FROM message
   WHERE data LIKE '%keyword%'
   ORDER BY time_created DESC
   LIMIT 20;" | column -t -s '|'
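json_extract can also be sanity-checked against a literal JSON value, with no database file involved (the field names here are illustrative, mirroring the examples above):

```shell
# Evaluate json_extract on a literal; the in-memory DB avoids touching
# any real data. Note the escaped \$ so the shell passes '$.role' through.
sqlite3 ':memory:' "SELECT json_extract('{\"role\":\"user\",\"modelID\":\"demo\"}', '\$.role');"
# → user
```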

Search session titles

sqlite3 -readonly "$DB_PATH" \
  "SELECT id, title, directory, datetime(time_updated/1000,'unixepoch','localtime') as updated
   FROM session
   WHERE title LIKE '%keyword%'
   ORDER BY time_updated DESC
   LIMIT 20;" | column -t -s '|'

View session summary stats

sqlite3 -readonly "$DB_PATH" \
  "SELECT title, summary_additions, summary_deletions, summary_files,
          datetime(time_created/1000,'unixepoch','localtime') as created
   FROM session
   ORDER BY time_updated DESC
   LIMIT 20;" | column -t -s '|'

4. Export sessions to Markdown

The export script writes one session per Markdown file. By default:

  • filename = session title + created time
  • time filtering uses time_updated unless --time-field created is passed
  • step-start / step-finish parts are skipped to reduce noise
  • when project.name is empty, project folder names fall back to the worktree basename, or global

Export sessions for one project

./scripts/export_opencode_sessions.py \
  --project opencode-session-toolkit \
  --output-dir ./exports/opencode-session-toolkit

--project matches by substring against project_id, project.name, project.worktree, and session.directory.

Export sessions in a time range

./scripts/export_opencode_sessions.py \
  --start 2026-03-01 \
  --end 2026-03-24T23:59:59 \
  --time-field updated \
  --output-dir ./exports/march

Accepted time formats:

  • ISO date: 2026-03-24
  • ISO datetime: 2026-03-24T22:35:37
  • Unix seconds / milliseconds
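If you prefer the Unix-milliseconds form, an ISO date can be converted in the shell first. This sketch assumes GNU date (`-d`); on macOS use `date -j -f` instead, and the boundary value is only illustrative:

```shell
# Convert an ISO date (midnight UTC) to the millisecond form the script
# also accepts. Requires GNU date for the -d flag.
START_MS=$(( $(date -u -d '2026-03-01 00:00:00' +%s) * 1000 ))
echo "$START_MS"
# → 1772323200000
```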

Full export grouped by project

./scripts/export_opencode_sessions.py \
  --all \
  --group-by-project \
  --output-dir ./exports/all

Output example:

exports/all/
  OrchAI/
    Migration work planning with subagent discussion_2026-03-23_23-48-07.md
  global/
    opencode-session-toolkit 命令验证与优化_2026-03-24_22-35-37.md

Useful extra filters

  • --session-id ses_xxx: exact session export
  • --title-contains keyword: match session titles
  • --directory-contains keyword: match session directories
  • --archived include|exclude|only: filter archived sessions
  • --filename-time-field created|updated: choose which session time goes into the filename
  • --include-part-type text --include-part-type tool: export only certain part types
  • --exclude-part-type reasoning: drop noisy part types
  • --overwrite: overwrite existing files instead of appending the session id to avoid collisions

If no filters are provided, the script requires --all to avoid accidental full-database exports.

5. Inspect schema

sqlite3 -readonly "$DB_PATH" ".schema session"
sqlite3 -readonly "$DB_PATH" ".schema message"
sqlite3 -readonly "$DB_PATH" ".schema part"
sqlite3 -readonly "$DB_PATH" ".schema project"

For complete field and index notes, see references/schema.md.

6. List all tables

sqlite3 -readonly "$DB_PATH" ".tables"

7. Example output

id          title                    directory           updated
----------  -----------------------  ------------------  -------------------
ses_abc123  My Session - 2026-03-24  /home/user/project  2026-03-24 10:00:00
ses_def456  Another Session          /home/user/other    2026-03-23 15:30:00

(Aligned with | column -t -s '|'.)

8. Notes

  • OpenCode uses SQLite WAL mode, so .db-wal and .db-shm files are expected.
  • Time fields are Unix timestamps in milliseconds. Convert with datetime(ts/1000,'unixepoch','localtime').
  • data fields are JSON. Use json_extract(data, '$.field') for structured extraction, and prefer sqlite3 -json for raw message inspection.
  • Session isolation is anchored by project_id; for cross-directory queries, joining project.worktree is recommended.
  • Direct writes can corrupt data. Back up before any non-read-only operation.
  • account and control_account tables may contain sensitive credentials. Redact outputs when sharing.
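The WAL point above can be observed on a throwaway database; nothing here touches the real OpenCode DB:

```shell
# Create a temp DB in WAL mode, then confirm the mode persists for a
# later read-only connection, just as it does for the OpenCode DB.
DB="$(mktemp -u /tmp/wal-demo-XXXXXX).db"
sqlite3 "$DB" "PRAGMA journal_mode=WAL; CREATE TABLE t(x);" >/dev/null
sqlite3 -readonly "$DB" "PRAGMA journal_mode;"
# → wal
rm -f "$DB" "$DB"-*
```

While a connection is open, the `-wal` and `-shm` sidecar files appear next to the DB, which is why they are expected beside OpenCode's database.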
