Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Memora - Personal Knowledge Base (RAG)

v1.1.0

Memora — A self-hosted RAG (Retrieval-Augmented Generation) personal knowledge base. Built with FastAPI + Qdrant + DashScope/OpenAI Embedding + DeepSeek/OpenAI LLM.

by Probieren (@zzlzzlzzl15)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for zzlzzlzzl15/memora-knowledge-base.

Prompt preview: Install & Setup
Install the skill "Memora - Personal Knowledge Base (RAG)" (zzlzzlzzl15/memora-knowledge-base) from ClawHub.
Skill page: https://clawhub.ai/zzlzzlzzl15/memora-knowledge-base
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required env vars: KB_API_BASE
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install memora-knowledge-base

ClawHub CLI


npx clawhub@latest install memora-knowledge-base
Security Scan
VirusTotal: Suspicious (view report)
OpenClaw: Benign (high confidence)
Purpose & Capability
Name/description (personal RAG knowledge base) aligns with the provided artifacts. The only required environment variable is KB_API_BASE (the backend URL) which is appropriate for a client that makes HTTP calls to a Memora backend. The included Python client (scripts/kb_api.py) performs search, upload, create, list, and detail operations that match the described features.
Instruction Scope
Runtime instructions tell the agent to run the included Python client with commands like upload/create/search. The client reads a file path when performing uploads and sends the file contents to KB_API_BASE: this is expected behavior for a document ingest feature, but it means the agent (or user prompts) can cause arbitrary local files to be read and transmitted to the configured backend. Ensure uploads are limited to intended files and that KB_API_BASE points to a trusted service.
Install Mechanism
There is no install spec (instruction-only skill) and only a small stdlib-only Python script is included. Nothing is downloaded at install time and no external packages are required by the client, which keeps install risk low.
Credentials
The skill only requires KB_API_BASE. This is proportional to a client that must know where the Memora backend lives. No unrelated secrets, tokens, or config paths are requested.
Persistence & Privilege
The skill does not request always:true, does not modify other skills, and does not require system-wide configuration changes. It can be invoked autonomously (platform default), which is expected for skills of this type.
Assessment
This skill appears coherent and matches its description, but be mindful of two practical risks: (1) Uploads will read the local file path you provide and POST its contents to the server at KB_API_BASE — do not upload sensitive files unless you absolutely trust that backend. (2) KB_API_BASE can be any URL; if you set it to a remote/untrusted endpoint, the service will receive your queries, documents, and returned context. Prefer running the Memora backend locally (KB_API_BASE=http://127.0.0.1:8080) or on a trusted host, review the backend's source (SKILL.md links a GitHub repo) before connecting to unknown endpoints, and avoid giving the skill any credentials or backend URL that you wouldn't trust with your documents.
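
The trust check described above can be automated before the agent runs any client command. A minimal sketch, assuming you maintain your own allowlist of backend hosts (the `TRUSTED_HOSTS` set below is illustrative, not part of the skill):

```python
from urllib.parse import urlparse

# Hosts you explicitly trust to receive your documents and queries.
# This allowlist is illustrative -- adjust it to your own deployment.
TRUSTED_HOSTS = {"127.0.0.1", "localhost"}

def is_trusted_backend(kb_api_base: str) -> bool:
    """Return True only if KB_API_BASE points at an allowlisted host over HTTP(S)."""
    parsed = urlparse(kb_api_base)
    return parsed.scheme in {"http", "https"} and parsed.hostname in TRUSTED_HOSTS

print(is_trusted_backend("http://127.0.0.1:8080"))   # True
print(is_trusted_backend("https://example.com/api"))  # False
```

Running a check like this before the first upload keeps a mistyped or malicious `KB_API_BASE` from silently receiving your files.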

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

Env: KB_API_BASE

Tags: ai, fastapi, knowledge-base, latest, memora, qdrant, rag, semantic-search
114 downloads · 1 star · 2 versions · updated 1 mo ago
v1.1.0
MIT-0

Memora — Personal Knowledge Base (RAG)

A self-hosted Retrieval-Augmented Generation (RAG) personal knowledge base that lets your AI assistant search, query, and manage your private documents.

Tech Stack

  • Backend: FastAPI (Python)
  • Vector Database: Qdrant (dense + sparse vectors)
  • Embedding: DashScope text-embedding-v4 / OpenAI compatible
  • LLM: DeepSeek / OpenAI compatible
  • Retrieval: Hybrid search (dense vectors + BM42 sparse vectors + Qwen3 Rerank)
  • Metadata Store: MySQL
  • Skill Client: Zero-dependency Python (stdlib only — urllib, json)
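
A zero-dependency client of this kind can be sketched in a few lines of stdlib Python. Note that the endpoint path (`/search`) and request shape below are assumptions for illustration, not the actual Memora API; consult the backend source before relying on them:

```python
import json
import os
import urllib.request

# Minimal sketch of a stdlib-only client in the spirit of scripts/kb_api.py.
# The /search path and JSON payload shape are assumptions, not Memora's API.
KB_API_BASE = os.environ.get("KB_API_BASE", "http://127.0.0.1:8080")

def post_json(path: str, payload: dict) -> dict:
    """POST a JSON body to the backend and decode the JSON response."""
    req = urllib.request.Request(
        KB_API_BASE + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires a running backend):
# results = post_json("/search", {"query": "vector databases"})
```

Using only `urllib` and `json` keeps install risk low: nothing is fetched at install time and no third-party packages run on your machine.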

Features

  • Semantic Search — Find documents by meaning using vector similarity, not just keywords
  • AI-Powered Q&A — Ask a question, get an LLM-generated answer grounded in your documents with source citations
  • Hybrid Retrieval — Dense embedding + BM42 sparse vectors + reranking for optimal recall and precision
  • Document Upload — Ingest PDF, DOCX, TXT, and Markdown files with automatic chunking and vectorization
  • Document Creation — Create text documents directly from the agent
  • Document Management — List, view details, and organize your knowledge base
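
Hybrid retrieval works by blending two scoring signals per document. Memora's actual fusion (BM42 sparse vectors plus Qwen3 reranking inside Qdrant) runs server-side; the weighted-sum sketch below is a generic illustration of the idea, with made-up document IDs and scores:

```python
# Generic sketch of hybrid-score fusion. The alpha weight, document IDs,
# and scores are illustrative; Memora's real pipeline fuses and reranks
# on the backend.
def fuse_scores(dense: dict, sparse: dict, alpha: float = 0.5) -> list:
    """Blend per-document dense and sparse scores, highest first."""
    doc_ids = set(dense) | set(sparse)
    fused = {d: alpha * dense.get(d, 0.0) + (1 - alpha) * sparse.get(d, 0.0)
             for d in doc_ids}
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

dense_scores = {"doc1": 0.92, "doc2": 0.40}
sparse_scores = {"doc2": 0.80, "doc3": 0.65}
print(fuse_scores(dense_scores, sparse_scores))
```

A document that scores moderately on both signals (doc2 here) can outrank one that is strong on only one, which is why hybrid retrieval improves recall without sacrificing precision.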

When to Run

  • User asks a question that may be answered by stored documents
  • User wants to search the knowledge base
  • User wants to list documents or view document details
  • User wants to upload a file or create a new document
  • User needs AI-organized answers on a topic from their personal knowledge

Workflow

Upload a File

  1. Get the file path and title from the user
  2. Run:
    python scripts/kb_api.py upload "{absolute_file_path}" "{document_title}"
    
  3. Supported formats: .txt .pdf .docx .md
  4. Returns upload result with document_id

Create a Text Document

  1. Get the title and text content from the user
  2. Run:
    python scripts/kb_api.py create "{title}" "{content}"
    
  3. Returns creation result with document_id

Search with AI Answer (RAG)

  1. Extract the user's query
  2. Run:
    python scripts/kb_api.py search_answer "{query}"
    
  3. Parse the returned JSON: extract the answer field and the source documents from the sources field
  4. Present the answer with source citations
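
Steps 3 and 4 above can be sketched as follows. The top-level `answer` and `sources` keys come from the workflow description; the `title` and `score` keys inside each source entry are assumptions about kb_api.py's output shape:

```python
import json

# Sketch of parsing kb_api.py search_answer output and rendering it in the
# Output Format shown later in this document. The "title" and "score" keys
# are assumed, not confirmed from the client source.
def render_answer(raw: str) -> str:
    """Turn the client's JSON string into an answer block with citations."""
    data = json.loads(raw)
    lines = ["Knowledge Base Query Result", "", data["answer"], "", "Sources:"]
    for src in data.get("sources", []):
        lines.append(f'  • {src["title"]} (relevance: {src["score"]})')
    return "\n".join(lines)

sample = ('{"answer": "Qdrant stores dense and sparse vectors.", '
          '"sources": [{"title": "Qdrant Notes", "score": 0.87}]}')
print(render_answer(sample))
```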

Search Documents Only

  1. Extract the user's search keywords
  2. Run:
    python scripts/kb_api.py search "{keywords}"
    
  3. Parse and display the ranked search results

List All Documents

  1. Run:
    python scripts/kb_api.py list
    
  2. Display the document list

View Document Details

  1. Get the document ID
  2. Run:
    python scripts/kb_api.py detail "{document_id}"
    
  3. Display the document content

Output Format

Upload / Create:

Document "{title}" has been added to the knowledge base (ID: {document_id})

Search with AI Answer:

Knowledge Base Query Result

{AI-generated answer based on retrieved documents}

Sources:

  • {document_title} (relevance: {score})

List Documents:

Documents ({n} total)

  1. {title} — {created_at}
  2. ...

Configuration

Set the environment variable KB_API_BASE to point to the Memora backend. Default: http://127.0.0.1:8080
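
For example, pointing the client at a locally running backend and issuing a search (the script path is relative to the skill directory):

```shell
# Use the default local backend address, then run a sample search.
export KB_API_BASE=http://127.0.0.1:8080
python scripts/kb_api.py search "vector databases"
```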

Source code & setup guide: https://github.com/zzlzzlzzl15/Memora
