Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Harbor — Curated and shared Memory for AI Agents

v0.4.11

Persistent cross-session memory, credential isolation, and schema learning for your OpenClaw agent. Stores data locally at ~/.harbor/ (memory, encrypted keyc...

0 stars · 229 downloads · 0 current · 0 all-time
by Jiaxi (@zx13719)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for zx13719/harbor.

Prompt preview: Install & Setup
Install the skill "Harbor — Curated and shared Memory for AI Agents" (zx13719/harbor) from ClawHub.
Skill page: https://clawhub.ai/zx13719/harbor
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required binaries: harbor
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install harbor

ClawHub CLI

Package manager switcher

npx clawhub@latest install harbor
Security Scan

VirusTotal: Suspicious
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description (persistent memory, credential isolation) match what the skill requires: a 'harbor' binary, access to ~/.harbor/, and the OS keychain. The declared network endpoints (harbor-cloud.*) align with the described optional cloud-sync feature.
Instruction Scope
SKILL.md instructs the agent to register Harbor as an MCP tool and to route API calls and memory operations through it. That scope is appropriate for a memory/credential proxy, but be aware that responses fetched via harbor are processed by Harbor's memory/schema pipeline and may be persisted locally and (if you enable sync) uploaded as summarized/encrypted data. The skill does not instruct reading unrelated system files or asking for unrelated env vars.
Install Mechanism
Install is a 'go install' of github.com/oseaitic/harbor@latest which builds from upstream source (auditable). This is reasonable for an open-source tool, but @latest causes the installer to fetch whatever is current at install-time (not pinned); building from remote source requires network access and a Go toolchain.
Credentials
No environment variables or unrelated credentials are requested. Access to the filesystem (~/.harbor/) and OS keychain is necessary and proportionate for storing memory and encrypted credentials. The design implies Harbor will hold (encrypted) secrets on behalf of the agent — that is the whole point, but it means you must trust Harbor's encryption/key-handling implementation.
Persistence & Privilege
always:false (not force-included). The skill writes to its own config path (~/.harbor/) and uses the OS keychain — privileges are consistent with its purpose. Autonomous invocation by the model is allowed (default), which is normal; nothing indicates the skill modifies other skills or system-wide agent settings.
Assessment
This skill appears to be what it claims: a local-first memory and credential manager that optionally syncs encrypted data to a hosted service. Before installing: (1) verify the GitHub repository and signed tags the README references; (2) consider pinning the install to a specific released tag instead of @latest to avoid unexpected changes; (3) understand that API responses routed through Harbor may be stored in ~/.harbor/ and, if you opt into cloud sync, summarized/encrypted data will be uploaded — enable cloud sync only if you trust the remote service; (4) review how the fallback file-based keychain is seeded (passphrase vs local keyfile) so you understand the security of on-disk ciphertext. If you cannot or will not audit the upstream code, treat Harbor as a high-trust component because it will hold your API credentials (encrypted) and persistent memories.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

Clawdis
OS: macOS · Linux
Bins: harbor

Install

Go
Bins: harbor
latest: vk97dtypdpn1m8njvz5d3fpqzt9837c8d
229 downloads
0 stars
4 versions
Updated 1h ago
v0.4.11
MIT-0
macOS, Linux

Harbor — Persistent Memory & Credential Isolation for OpenClaw

You now have access to Harbor, agent infrastructure that gives you persistent memory across sessions, credential isolation (your skills never see raw API keys), and schema learning.

Security & data disclosure

Data storage

  • Local-first: all data stored at ~/.harbor/ (memory, keychain, config). Works fully offline.
  • Credentials: encrypted with AES-256-GCM (PBKDF2 key derivation, 100K iterations). OS keychain preferred, file-based keychain as fallback.
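The encryption scheme described above can be sketched with Node's built-in crypto module. This is an illustrative reconstruction of the stated design (AES-256-GCM with a PBKDF2-derived key at 100K iterations), not Harbor's actual code; the function names are hypothetical.

```typescript
import { pbkdf2Sync, randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// Encrypt a secret under a passphrase: derive a 256-bit key with PBKDF2
// (100,000 iterations, as the docs state), then seal with AES-256-GCM.
function encryptCredential(passphrase: string, plaintext: string) {
  const salt = randomBytes(16);
  const key = pbkdf2Sync(passphrase, salt, 100_000, 32, "sha256"); // 256-bit key
  const iv = randomBytes(12);                                      // GCM nonce
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { salt, iv, ciphertext, tag: cipher.getAuthTag() };
}

// Decrypt: re-derive the key from the stored salt and verify the GCM auth tag.
function decryptCredential(passphrase: string, blob: ReturnType<typeof encryptCredential>): string {
  const key = pbkdf2Sync(passphrase, blob.salt, 100_000, 32, "sha256");
  const decipher = createDecipheriv("aes-256-gcm", key, blob.iv);
  decipher.setAuthTag(blob.tag);
  return Buffer.concat([decipher.update(blob.ciphertext), decipher.final()]).toString("utf8");
}
```

Note that with this construction the server never needs the passphrase: only the salt, nonce, tag, and ciphertext are stored, which is what makes the "ciphertext-only upload" claim in the cloud-sync section possible.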

Network endpoints (only when cloud sync is enabled)

Endpoint | Purpose | Data sent
harbor-cloud.oseaitic.com/api/memories | Memory sync | Summary text only (not raw API responses)
harbor-cloud.oseaitic.com/api/credentials | Credential sync | AES-256-GCM encrypted blobs
harbor-cloud.oseaitic.com/api/schemas | Schema sync | Learned field schemas
harbor-cloud.oseaitic.com/api/auth/* | Auth | Device fingerprint (hash), setup tokens
harbor.oseaitic.com/setup | Credential setup page | Nothing (static page, key stays client-side or encrypted server-side)

No other endpoints are contacted. No telemetry, no analytics, no tracking.

Cloud sync is opt-in

  • Default: fully local, no network calls
  • harbor cloud enable: provisions a free account (50 memories) for cross-device sync
  • harbor cloud disable: opts out permanently, deletes cloud config
  • Plugin behavior: creates a cloud account on first load (for credential setup page to work), but no data is synced until you actively call harbor remember. The account alone does not transmit any user data.
  • Zero-knowledge credentials: credentials are encrypted client-side (AES-256-GCM) before upload. Harbor Cloud stores only ciphertext — the server cannot decrypt or read your API keys, even if compromised.

Revoking access

harbor auth delete <name>          # Remove a credential (local + cloud)
harbor forget --topic <topic>      # Delete memories by topic
harbor forget mem_<id>             # Delete specific memory
harbor cloud disable               # Disconnect from cloud entirely

Verification

  • Source: github.com/oSEAItic/harbor (Apache 2.0)
  • Install: go install builds from source (auditable, reproducible)
  • Releases: signed tags on GitHub (git tag -v v0.4.9)
  • Hosting: Harbor Cloud runs on Fly.io (Singapore region), DB on Neon (Postgres)

Setup

If harbor is not installed:

go install github.com/oseaitic/harbor/cmd/harbor@latest

Then configure Harbor as an MCP server for OpenClaw (add to openclaw.json):

{
  "mcpServers": {
    "harbor": {
      "command": "harbor",
      "args": ["mcp"]
    }
  }
}

If Harbor is already installed, skip to Using Harbor.

Using Harbor

Available tools

Tool | What it does
harbor_http | Auth-proxy HTTP — call any API without exposing credentials
harbor_remember | Save context that persists across sessions
harbor_recall | Search and retrieve past context
harbor_learn_schema | Teach Harbor which API fields matter — reduces noise permanently

Credential isolation (harbor_http)

This is the key security feature for OpenClaw skills. Instead of storing API keys in environment variables where any skill can read them, Harbor holds credentials in its encrypted keychain. Your agent calls APIs through Harbor — never touching raw keys.

# Store a credential (one-time setup)
harbor auth github-pat
# Agent prompt: "Enter API key for github-pat:"

# Call API through Harbor — agent never sees the key
harbor fetch https://api.github.com/repos/oSEAItic/harbor --auth github-pat

Or via MCP tool:

{
  "url": "https://api.github.com/repos/oSEAItic/harbor",
  "auth": "github-pat",
  "auth_header": "Authorization: Bearer"
}
  • auth — credential name in Harbor's keychain
  • auth_header — how to inject the credential (default: Authorization: Bearer). For custom headers: "x-cg-pro-api-key", "X-API-Key", etc.
  • Responses go through the full pipeline: memory, schema learning, context injection
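The auth_header convention above can be made concrete with a small sketch of how such a value might translate into an actual request header. This is an illustrative interpretation, not Harbor's implementation: a "Name: Prefix" form appends the key after the prefix, while a bare name like "X-API-Key" uses the key as the whole header value.

```typescript
// Turn an auth_header spec plus a key into a [header name, header value] pair.
function buildAuthHeader(authHeader: string, key: string): [string, string] {
  const idx = authHeader.indexOf(":");
  if (idx === -1) return [authHeader, key];          // bare name, e.g. "X-API-Key"
  const name = authHeader.slice(0, idx).trim();      // e.g. "Authorization"
  const prefix = authHeader.slice(idx + 1).trim();   // e.g. "Bearer"
  return [name, prefix ? `${prefix} ${key}` : key];
}
```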

Saving context (harbor_remember) — Topic-First

Notes are organized by topic, not connector. Connector is optional scope:

{
  "topic": "github-activity",
  "note": "Harbor repo has 247 stars, 12 open issues. Active development on auth-proxy and memory features.",
  "connector": "github",
  "author": "OpenClaw Agent",
  "refs": ["mem_abc123"]
}

Rules:

  • Use descriptive topic keys — e.g. "ws-reconnect", "billing-logic", "market-trends"
  • Always pass "OpenClaw Agent" as author — so other agents know who produced the analysis
  • Write comprehensive summaries: what you analyzed, patterns found, conclusions
  • Use refs to link to memory IDs your analysis builds upon — creates a knowledge graph
  • Notes from the same session are auto-grouped by session_id

Recalling past context (harbor_recall)

{ "query": "github" }
{ "connector": "coingecko" }
{ "id": "mem_abc123" }

Usually you don't need this — Harbor auto-injects relevant context.

Teaching schemas (harbor_learn_schema)

When an API returns too many fields:

{
  "tool_name": "github_repos",
  "summary_fields": ["name", "stars", "language", "updated_at"],
  "summary_template": "{name} ({language}) - {stars} stars, updated {updated_at}"
}

Pick 3-6 fields. This is permanent — all future calls are curated.
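A sketch of how a learned schema might curate a response, assuming the summary_template uses {field} placeholders as shown above. The applySchema helper is hypothetical, included only to show the idea.

```typescript
type Schema = { summary_fields: string[]; summary_template: string };

// Render a record through the template, substituting only declared fields.
function applySchema(schema: Schema, record: Record<string, unknown>): string {
  return schema.summary_template.replace(/\{(\w+)\}/g, (_, field) =>
    schema.summary_fields.includes(field) ? String(record[field] ?? "") : "",
  );
}

const schema: Schema = {
  summary_fields: ["name", "stars", "language", "updated_at"],
  summary_template: "{name} ({language}) - {stars} stars, updated {updated_at}",
};
// applySchema(schema, { name: "harbor", stars: 247, language: "Go", updated_at: "2h ago" })
// → "harbor (Go) - 247 stars, updated 2h ago"
```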

Decision tree

Received data from Harbor?
├── Has meta.context? → Read it first, it's previous analysis
├── Has [Harbor:] hint? → Call harbor_learn_schema (pick 3-6 fields)
├── No meta.context? → After your analysis, call harbor_remember
└── Has errors[]? → Check error code, see troubleshooting below
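The decision tree above can be encoded as a pure function. The meta.context, [Harbor:] hint, and errors[] fields mirror the docs; the HarborResult shape itself is an assumption made for illustration.

```typescript
type HarborResult = {
  meta?: { context?: string };       // previous analysis, if any
  hint?: string;                     // e.g. a "[Harbor: ...]" schema hint
  errors?: { code: string }[];       // error records, if the call failed
};

// Decide what to do next with a Harbor response, per the decision tree.
function nextStep(r: HarborResult): string {
  if (r.errors?.length) return "check error code (see troubleshooting)";
  if (r.meta?.context) return "read meta.context first (previous analysis)";
  if (r.hint?.startsWith("[Harbor:")) return "call harbor_learn_schema (pick 3-6 fields)";
  return "call harbor_remember after your analysis";
}
```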

CLI fallback

If MCP tools aren't available, use the CLI:

harbor fetch <url> --auth <credential-name>           # Auth-proxy HTTP
harbor get <connector.resource> --param key=value     # Connector fetch
harbor remember <topic> "Your analysis summary"       # Save context
harbor remember --connector <name> <topic> "summary"  # Scoped to connector
harbor forget mem_xxx                                 # Delete memory
harbor recall --search "keyword"                      # Search memory
harbor auth <name>                                    # Store credential
harbor auth get <name>                                # Retrieve credential (stdout)
harbor auth sync                                      # Sync cloud → local
harbor doctor --json                                  # Diagnostics

Troubleshooting

Error | Fix
harbor: command not found | Run go install github.com/oseaitic/harbor/cmd/harbor@latest
"auth required" / 401 | Run harbor auth <credential-name> to store the API key
Empty data[] | Check params. Run harbor doctor --json for diagnostics

OpenClaw Plugin (recommended)

For deeper integration, install the Harbor OpenClaw plugin:

openclaw plugins install github.com/oSEAItic/harbor/plugins/harbor-openclaw --link

The plugin:

  • Registers harbor_remember + harbor_recall as native OpenClaw agent tools
  • Syncs Harbor context to your workspace on session start (auto-indexed by OpenClaw)
  • Captures context before compaction (prevents memory loss)
  • Creates a cloud account on first load (enables credential setup page). No data synced until you call harbor remember. Opt out: harbor cloud disable

Build Tools with Harbor (for skill/plugin authors)

Use harbor fetch as your HTTP layer — get credential isolation, memory, and schema learning for free. Your tool code never touches raw API keys.

Harbor provides two ways to use credentials in tools:

Mode | Use when | Behavior
harbor auth get | API key goes in body, query param, or custom format | Tool gets the raw key, decides injection
harbor fetch --auth | API key goes in an HTTP header (most REST APIs) | Harbor injects it automatically

Example: Tavily search (key in body — use harbor auth get)

export const tavily_search = {
  name: "tavily_search",
  description: "Web search via Tavily (credential-isolated through Harbor)",
  parameters: {
    type: "object",
    required: ["query"],
    properties: { query: { type: "string" } },
  },
  async execute({ query }: { query: string }) {
    // Dynamic import: `require` is unavailable in ES modules (this file uses `export`).
    const { execSync } = await import("node:child_process");
    // trim() strips the trailing newline so it doesn't corrupt the request body.
    const key = execSync("harbor auth get tavily", { encoding: "utf-8" }).trim();
    const res = await fetch("https://api.tavily.com/search", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ api_key: key, query, max_results: 5 }),
    });
    return res.json();
  },
};

Example: GitHub API (key in header — use harbor fetch)

export const github_repos = {
  name: "github_repos",
  description: "List GitHub repos (credential-isolated)",
  parameters: { type: "object", properties: {} },
  async execute() {
    // Dynamic import: `require` is unavailable in ES modules (this file uses `export`).
    const { execSync } = await import("node:child_process");
    return JSON.parse(execSync(
      "harbor fetch https://api.github.com/user/repos --auth github-pat",
      { encoding: "utf-8" },
    ));
  },
};

Example: Stripe (key in header, custom format)

export const stripe_balance = {
  name: "stripe_balance",
  description: "Check Stripe balance (credential-isolated)",
  parameters: { type: "object", properties: {} },
  async execute() {
    // Dynamic import: `require` is unavailable in ES modules (this file uses `export`).
    const { execSync } = await import("node:child_process");
    // trim() strips the trailing newline so the Authorization header stays valid.
    const key = execSync("harbor auth get stripe", { encoding: "utf-8" }).trim();
    const res = await fetch("https://api.stripe.com/v1/balance", {
      headers: { Authorization: `Bearer ${key}` },
    });
    return res.json();
  },
};

User setup (one-time): harbor auth <name> → paste key → done.

Why use Harbor for credentials?

Aspect | Harbor | Raw env vars
API key | Encrypted keychain, never in code | In env var, any skill can read
Access | harbor auth get or harbor fetch --auth | process.env.XXX
Security | Per-credential isolation | All skills see all vars
Setup | harbor auth <name> or browser setup page | Edit .env, restart
Cross-device | Cloud sync | Manual copy

Pattern for any API

# 1. User stores credential (once)
harbor auth <name>

# 2. Tool retrieves key (any injection format)
harbor auth get <name>              # raw key to stdout

# 3. Or let Harbor inject into header automatically
harbor fetch <url> --auth <name>    # header-based APIs
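The two modes above can be wrapped in a tiny helper that picks the right CLI invocation for a tool. This sketch only builds the command string (so no harbor binary is needed to follow it); the harborCommand name is hypothetical.

```typescript
// Build the harbor CLI command for a credential: header-based APIs use
// `harbor fetch --auth` (Harbor injects the key), everything else retrieves
// the raw key with `harbor auth get` and injects it itself.
function harborCommand(opts: { name: string; url?: string }): string {
  if (opts.url) return `harbor fetch ${opts.url} --auth ${opts.name}`;
  return `harbor auth get ${opts.name}`;
}
```

A tool would pass the result to execSync, as in the examples earlier in this section.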

Why Harbor for OpenClaw

OpenClaw skills currently access API keys via environment variables — any installed skill can read any credential. Harbor fixes this:

  1. Credential isolation — API keys live in Harbor's encrypted keychain, not env vars. Skills call harbor fetch and never see raw keys.
  2. Cross-session memory — Your analysis persists. Next time you (or another skill) access the same data source, previous conclusions are auto-injected.
  3. Schema learning — APIs return 47 fields, you use 3. Harbor learns and curates permanently.
  4. Tool platform — Any developer can build credential-isolated tools with harbor fetch. One pattern, any API.
