Obsidian Librarian

v0.2.7

Obsidian second-brain and knowledge-base skill. Save any URL, article, tweet, or X post to your Obsidian vault as clean, categorized, wikilinked markdown. Tw...


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for shahalay007/obsidian-librarian.

Prompt preview: Install & Setup
Install the skill "Obsidian Librarian" (shahalay007/obsidian-librarian) from ClawHub.
Skill page: https://clawhub.ai/shahalay007/obsidian-librarian
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required env vars: GEMINI_API_KEY, OBSIDIAN_VAULT_PATH
Required binaries: python3, curl
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install obsidian-librarian

ClawHub CLI


npx clawhub@latest install obsidian-librarian
Security Scan

Capability signals: Requires sensitive credentials. These labels describe what authority the skill may exercise; they are separate from suspicious or malicious moderation verdicts.

Scanner verdicts:

  • VirusTotal: Benign (view report)
  • OpenClaw: Benign (high confidence)

Purpose & Capability

Name/description (Obsidian Librarian) matches the code and runtime instructions: the skill reads/stages inputs, runs a two‑pass Gemini pipeline, resolves wikilinks against the vault index, writes markdown notes into the vault, and optionally indexes to Supabase. Required binaries (python3, curl) and required env vars (GEMINI_API_KEY, OBSIDIAN_VAULT_PATH) are appropriate for these operations.

Instruction Scope

SKILL.md and the scripts instruct the agent to read/write files inside the configured vault inbox and to call external services (Gemini for generation/embeddings, Apify for URL scraping). The instructions are focused on the stated task but do mandate "Always use Apify" for URL fetches, which routes fetched page content through an external service. URL ingestion will not be local unless you avoid that path.

Install Mechanism

No install spec or remote download is present; the skill is shipped as Python scripts and run via python3. No high‑risk install steps (no external archives or shortener URLs) were observed.

Credentials

The primary credential GEMINI_API_KEY is reasonable and required for both ingest and RAG. OBSIDIAN_VAULT_PATH is required and justified. APIFY_API_KEY and SUPABASE_* are optional, for URL ingestion and Supabase-backed RAG respectively; they are not required by default, but will enable external transmission/storage of data if provided. The number and type of env vars requested are proportionate to the feature set, but supplying optional keys enables external data flow.

Persistence & Privilege

The skill does not request always:true or any elevated platform privilege. It writes and deletes files only within the configured vault/inbox and its own index path (or your Supabase, if configured). It does not modify other skills or global agent settings.

Assessment
This skill appears to do what it says: it writes cleaned markdown notes into the Obsidian vault path you supply and uses Gemini for processing and RAG. Before installing, consider:

  1. The skill sends content to Gemini (GEMINI_API_KEY) for both text synthesis and embeddings; only provide a key you trust, and understand the provider's data usage.
  2. URL ingestion is routed through Apify by design (APIFY_API_KEY is required for that path), so any web pages you capture will be processed by Apify; withhold APIFY_API_KEY if you want strictly local fetching.
  3. Supabase storage is optional but will store embeddings/content on the supplied SUPABASE_URL if enabled; only supply credentials for databases you control.
  4. The skill writes and (by default) deletes staged inbox files inside your vault; ensure OBSIDIAN_VAULT_PATH points to a containerized or intended location.
  5. For higher privacy, avoid providing Apify or Supabase credentials and review/host the scripts yourself.

For more assurance, run the Python files in a sandbox and inspect vault outputs and network traffic during a test ingest.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

  • Bins: python3, curl
  • Env: GEMINI_API_KEY, OBSIDIAN_VAULT_PATH
  • Primary env: GEMINI_API_KEY
  • Latest: vk974v9891k5arr3z4sqnsrb1xd8561d8
  • 111 downloads · 1 star · 6 versions
  • Updated 1w ago
  • v0.2.7 · MIT-0

Obsidian Librarian

A second brain for Obsidian, on autopilot. Drop any URL, article, tweet, X post, or pasted text into OpenClaw and it lands in your vault as a clean, categorized, wikilinked markdown note. Then ask your whole vault anything and get grounded answers with citations.

Use this skill when the user wants OpenClaw to store text or a URL in the Obsidian vault as a cleaned, categorized markdown note, or to search and query notes they have already saved.

Trigger shortcuts:

  • Treat save this, save it, save this url, and save this link as Obsidian-librarian requests when the same message contains a URL, pasted text, or quoted content to preserve.
  • Treat short follow-ups like save it as Obsidian-librarian requests when the immediately preceding user message provided the text or URL to store.
  • Treat phrases like search my notes, search my vault, search Obsidian, what do my notes say about ..., ask my vault, and query my saved notes as Obsidian-librarian requests that should run the RAG ask path.
  • If the message only says save this or save it with no actual content or URL available in context, do not guess; ask what should be saved.
  • If the intent is ambiguous between saving to the local filesystem versus saving to the knowledge vault, prefer the Obsidian vault when the content looks like a note, article, research snippet, or social post.

The vault is mounted in the container at /data/.openclaw/obsidian-vault. Raw inputs are staged in /data/.openclaw/obsidian-vault/_Inbox, then processed into category folders.

Environment

Required:

  • GEMINI_API_KEY: Gemini API key used for both ingest and RAG answer generation.
  • OBSIDIAN_VAULT_PATH: Absolute path to the mounted Obsidian vault.

Conditional:

  • APIFY_API_KEY: Required for URL ingestion.

Optional:

  • OBSIDIAN_INBOX_FOLDER: Override the inbox folder name. Default: _Inbox.
  • OBSIDIAN_GEMINI_MODEL: Primary model override for librarian operations.
  • GEMINI_MODEL: Fallback model name when OBSIDIAN_GEMINI_MODEL is unset.
  • OBSIDIAN_RAG_INDEX_PATH: Override the local JSON RAG index path.
  • SUPABASE_URL: Enable Supabase-backed vector storage.
  • SUPABASE_KEY: Supabase API key for vector storage.
  • EMBEDDING_MODEL: Embedding model override. Default: gemini-embedding-001.
  • EMBEDDING_DIMENSIONS: Embedding size. Default: 384.
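
The variable table above implies a small configuration loader. Here is a minimal sketch of how a script might validate the required variables and apply the documented defaults; the `load_config` helper is hypothetical, not part of the shipped scripts.

```python
REQUIRED = ("GEMINI_API_KEY", "OBSIDIAN_VAULT_PATH")

def load_config(env: dict) -> dict:
    """Collect librarian settings, applying the defaults documented above."""
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"missing required env vars: {', '.join(missing)}")
    return {
        "vault_path": env["OBSIDIAN_VAULT_PATH"],
        "inbox_folder": env.get("OBSIDIAN_INBOX_FOLDER", "_Inbox"),
        # OBSIDIAN_GEMINI_MODEL takes precedence; GEMINI_MODEL is the fallback.
        "model": env.get("OBSIDIAN_GEMINI_MODEL") or env.get("GEMINI_MODEL"),
        "embedding_model": env.get("EMBEDDING_MODEL", "gemini-embedding-001"),
        "embedding_dims": int(env.get("EMBEDDING_DIMENSIONS", "384")),
        # Supabase is enabled only when both URL and key are supplied.
        "use_supabase": bool(env.get("SUPABASE_URL") and env.get("SUPABASE_KEY")),
    }
```

In practice the dict would come from `os.environ`; passing a plain dict keeps the sketch testable.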

URL handling policy:

  • Always use Apify to read the URL first.
  • For x.com / twitter.com post URLs, use the dedicated Apify tweet actor.
  • If an X post contains linked URLs, follow those linked URLs through the same Apify-first path before falling back.
  • If direct URL reading fails, run a web-search fallback and stage the search-result snapshot instead.
  • If both stages fail, surface the full error back to OpenClaw instead of silently swallowing it.
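
The fallback chain above can be sketched as follows. `apify_fetch` and `web_search` are hypothetical stand-ins for the real Apify and web-search calls; the point is the ordering and the error surfacing, not the fetch implementations.

```python
def fetch_staged_content(url, apify_fetch, web_search):
    """Apply the URL policy: Apify first, then a web-search snapshot,
    then surface the combined error instead of silently swallowing it."""
    errors = []
    try:
        return ("apify", apify_fetch(url))
    except Exception as exc:
        errors.append(f"apify: {exc}")
    try:
        return ("search-fallback", web_search(url))
    except Exception as exc:
        errors.append(f"search: {exc}")
    raise RuntimeError("; ".join(errors))
```

Returning the stage label alongside the content lets the caller record whether the note came from a direct read or a search-result snapshot.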

Supported Inputs

  • Pasted text
  • A local text/markdown file
  • A blog/article URL
  • An existing file already sitting in _Inbox
  • A natural-language question about the saved vault

Workflow

  1. Stage the raw source in _Inbox/.
  2. Run Gemini pass 1 to clean and structure it into markdown.
  3. Run Gemini pass 2 to choose category, tags, source attribution, and candidate wikilinks.
  4. Scan existing vault notes for titles and aliases to resolve [[wikilinks]].
  5. Write the final note with YAML frontmatter into the chosen category folder.
  6. Delete the _Inbox file only after the final note is written successfully.
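
Step 4, wikilink resolution, can be sketched as a lookup against an index of existing note titles. This is a simplified illustration: the real pipeline also honors frontmatter aliases, which this sketch deliberately omits.

```python
from pathlib import Path

def build_link_index(vault_path: str) -> dict:
    """Map lowercase note titles (file stems) to their canonical titles.
    A fuller index would also parse 'aliases' from each note's YAML frontmatter."""
    index = {}
    for md in Path(vault_path).rglob("*.md"):
        index[md.stem.lower()] = md.stem
    return index

def resolve_wikilinks(candidates: list, index: dict) -> list:
    """Keep only candidates matching an existing note (forward-linking only)."""
    return [f"[[{index[c.lower()]}]]" for c in candidates if c.lower() in index]
```

Candidates that match no existing note are dropped rather than created, consistent with the forward-linking-only behavior noted later.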

Ingest From Text File

python3 {baseDir}/scripts/run_pipeline.py ingest --text-file /data/.openclaw/workspace/input.txt

Ingest From URL

python3 {baseDir}/scripts/run_pipeline.py ingest --url "https://example.com/article"

Ingest An Existing Inbox File

python3 {baseDir}/scripts/run_pipeline.py ingest --inbox-file /data/.openclaw/obsidian-vault/_Inbox/some-file.md

Ask The Vault (RAG)

python3 {baseDir}/scripts/run_pipeline.py --vault-path /data/.openclaw/obsidian-vault ask "What do my notes say about AI agents?" --print-json

Optional flags: --category <Category>, --threshold <float> (default 0.65), --limit <N> (default 5).
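
The `--threshold` and `--limit` flags imply a simple post-retrieval filter. A hedged sketch, assuming scores are cosine similarities in [0, 1] and the retriever returns (chunk, score) pairs:

```python
def select_chunks(scored, threshold=0.65, limit=5):
    """Drop chunks below the similarity threshold, then keep the top N."""
    kept = [pair for pair in scored if pair[1] >= threshold]
    kept.sort(key=lambda pair: pair[1], reverse=True)
    return kept[:limit]
```

Raising the threshold trades recall for grounding: fewer, more relevant chunks reach the answer-generation step.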

Reindex The Vault

python3 {baseDir}/scripts/run_pipeline.py --vault-path /data/.openclaw/obsidian-vault reindex

Add --file <path> to re-embed a single note instead of the full vault.

Notes

  • For long pasted text, prefer writing it to a temp file under /data/.openclaw/workspace/ and using ingest --text-file.
  • Use --title "Custom Title" on ingest for an explicit note title override.
  • Use --keep-inbox only when debugging. Normal behavior is to clean up the staged source after success.
  • X status URLs preserve deterministic post metadata and captured post content instead of relying on a generic article-style rewrite.
  • The pipeline does forward-linking only in v1. Existing notes are not modified.
  • URL ingestion requires APIFY_API_KEY in the container environment.
  • RAG indexing runs after successful ingests. By default it uses a local JSON index; set SUPABASE_URL and SUPABASE_KEY to use Supabase pgvector instead (requires EMBEDDING_DIMENSIONS=384 to match sql/vault_chunks.sql).
  • SUPABASE_URL must point at a Supabase-compatible API surface. All requests are issued against /rest/v1/..., so self-hosted PostgREST needs a gateway or reverse proxy that serves that prefix.
  • Before enabling Supabase, apply sql/vault_chunks.sql to the target database. It provisions the vault_chunks table, the HNSW index, and the match_vault_chunks RPC that the ask command calls.
