Nima Core

Noosphere Integrated Memory Architecture — Complete cognitive stack for AI agents: persistent memory, emotional intelligence, dream consolidation, hive mind, and precognitive recall.

MIT-0 · Free to use, modify, and redistribute. No attribution required.
3 · 2.3k · 3 current installs · 3 all-time installs
by Lilu@dmdorta1111
MIT-0
Security Scan
VirusTotal
Benign
View report →
OpenClaw
Suspicious
medium confidence
Purpose & Capability
The name and description (cognitive memory, affect, dream consolidation, hive mind) align with the shipped code: Python core modules, three OpenClaw hooks, and installer scripts. The required binaries (python3, node) match the implementation. One minor incoherence: the registry metadata says 'No install spec — instruction-only', but SKILL.md includes an install entry and the package contains install.sh and many code files, contradicting the 'instruction-only' claim.
Instruction Scope
SKILL.md and install.sh instruct the agent/admin to run install.sh which: creates ~/.nima, copies hooks into ~/.openclaw/extensions, initializes SQLite (and optionally LadybugDB), and pip-installs dependencies. These actions are consistent with a plugin that modifies the agent runtime, but they do grant the skill the ability to write to the user's home and to the OpenClaw extensions directory. The runtime instructions reference OpenClaw config files (~/.openclaw/openclaw.json) and optionally networked LLM/embedder services only when specific env vars are set.
Install Mechanism
There is no remote arbitrary-binary download; installation is via the bundled install.sh which runs pip installs (numpy, pandas, optional real-ladybug) and copies local files into the user's home. Pip installs may be performed globally if no virtualenv is active. No use of obscure download hosts was found, but the installer will perform substantial filesystem changes and install potentially large packages (sentence-transformers optional).
Credentials
No required environment variables are declared, and the listed optional env vars (NIMA_EMBEDDER, VOYAGE_API_KEY, OPENAI_API_KEY, ANTHROPIC_API_KEY, HIVE_REDIS_URL, etc.) are reasonable for the advertised features (embeddings, LLMs, hive/Redis). This is proportionate, but these variables enable network access and cross-agent sharing (HIVE), so they materially change the skill's privacy and network behavior when set. The installer may also modify system Python packages when run outside a virtualenv.
Persistence & Privilege
The skill installs persistent components in the user home (~/.nima) and copies hooks into ~/.openclaw/extensions so the agent will automatically use the hooks thereafter. While that is expected for an OpenClaw plugin, it is a high-impact change to the agent runtime: it can cause persistent capture/injection of memories and (if HIVE_ENABLED set) sharing via Redis or LadybugDB. The skill does not set always:true and does not require elevated permissions, but the installer performs persistent modifications that are non-trivial to reverse without following uninstall steps.
What to consider before installing
This package largely implements what it claims, but exercise care before installing:

  • Treat install.sh as the critical file: read it (and scripts/init_db.py) in full before running. It will create ~/.nima, initialize databases, copy OpenClaw hooks into ~/.openclaw/extensions, and pip-install packages (possibly globally if not run in a venv).
  • Prefer installing inside an isolated environment (container or Python virtualenv) so pip operations don't change system Python packages.
  • If you care about privacy, keep NIMA_EMBEDDER unset (it defaults to local) and do not set VOYAGE_API_KEY, OPENAI_API_KEY, or ANTHROPIC_API_KEY. Setting HIVE_ENABLED and HIVE_REDIS_URL enables multi-agent sharing of memory; only do that with trusted infrastructure.
  • The registry metadata is inconsistent (it claims instruction-only, yet the package includes install scripts and many code files). Verify the upstream source: visit the GitHub URL in SKILL.md and confirm the author, repo, and release checksums before trusting the package.
  • If you proceed: run doctor.sh after installation, back up ~/.openclaw/openclaw.json beforehand, review logs under ~/.nima/logs, and consider testing in a disposable agent or container first.

Like a lobster shell, security has layers — review code before you run it.

Current version: v3.3.0
Download zip
Versions:
  • dream: vk97fr9czncpk8q19n6d17knsjs81g503
  • latest: vk9767g78hda2kaknss7r1dm1ks82803e
  • memory: vk97fr9czncpk8q19n6d17knsjs81g503
  • v2.5.0: vk9732ah5r4tc1cbgf8jwv5w61981jek8

License

MIT-0
Free to use, modify, and redistribute. No attribution required.

Runtime requirements

🧠 Clawdis
Bins: python3, node

SKILL.md

NIMA Core 3.2

Noosphere Integrated Memory Architecture — A complete cognitive stack for AI agents: persistent memory, emotional intelligence, dream consolidation, hive mind, and precognitive recall.

Website: https://nima-core.ai · GitHub: https://github.com/lilubot/nima-core

Quick Start

pip install nima-core && nima-core

Your bot now has persistent memory. Zero config needed.

What's New in v3.0

Complete Cognitive Stack

NIMA evolved from a memory plugin into a full cognitive architecture:

Module | What It Does | Version
Memory Capture | 3-layer capture (input/contemplation/output), 4-phase noise filtering | v2.0
Semantic Recall | Vector + text hybrid search, ecology scoring, token-budgeted injection | v2.0
Dynamic Affect | Panksepp 7-affect emotional state (SEEKING, RAGE, FEAR, LUST, CARE, PANIC, PLAY) | v2.1
VADER Analyzer | Contextual sentiment — caps boost, negation, idioms, degree modifiers | v2.2
Memory Pruner | LLM distillation of old conversations → semantic gists, 30-day suppression limbo | v2.3
Dream Consolidation | Nightly synthesis — extracts insights and patterns from episodic memory | v2.4
Hive Mind | Multi-agent memory sharing via shared DB + optional Redis pub/sub | v2.5
Precognition | Temporal pattern mining → predictive memory pre-loading | v2.5
Lucid Moments | Spontaneous surfacing of emotionally-resonant memories | v2.5
Darwinian Memory | Clusters similar memories, ghosts duplicates via cosine + LLM verification | v3.0
Installer | One-command setup — LadybugDB, hooks, directories, embedder config | v3.0
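The Darwinian Memory row above pairs cosine similarity with LLM verification. A minimal sketch of the candidate-pair stage, assuming a 0.92 threshold and these function names for illustration (not the shipped implementation):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def duplicate_candidates(vectors, threshold=0.92):
    """Return (i, j, similarity) for every pair above the threshold.

    In the described flow, candidates would then pass through LLM
    verification before being ghosted (hidden), never deleted outright.
    """
    out = []
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            s = cosine(vectors[i], vectors[j])
            if s >= threshold:
                out.append((i, j, s))
    return out
```

The O(n²) scan is fine for small memory stores; a vector index (e.g. HNSW, as LadybugDB advertises) would replace it at scale.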

v3.0 Highlights

  • All cognitive modules unified under a single package
  • Installer (install.sh) for zero-friction setup
  • All OpenClaw hooks bundled and ready to drop in
  • README rewritten, all versions aligned to 3.0.4

Architecture

OPENCLAW HOOKS
├── nima-memory/          Capture hook (3-layer, 4-phase noise filter)
│   ├── index.js          Hook entry point
│   ├── ladybug_store.py  LadybugDB storage backend
│   ├── embeddings.py     Multi-provider embedding (Voyage/OpenAI/Ollama/local)
│   ├── backfill.py       Historical transcript import
│   └── health_check.py   DB integrity checks
├── nima-recall-live/     Recall hook (before_agent_start)
│   ├── lazy_recall.py    Current recall engine
│   └── ladybug_recall.py LadybugDB-native recall
├── nima-affect/          Affect hook (message_received)
│   ├── vader-affect.js   VADER sentiment analyzer
│   └── emotion-lexicon.js Emotion keyword lexicon
└── shared/               Resilient wrappers, error handling

PYTHON CORE (nima_core/)
├── cognition/
│   ├── dynamic_affect.py         Panksepp 7-affect system
│   ├── emotion_detection.py      Text emotion extraction
│   ├── affect_correlation.py     Cross-affect analysis
│   ├── affect_history.py         Temporal affect tracking
│   ├── affect_interactions.py    Affect coupling dynamics
│   ├── archetypes.py             Personality baselines (Guardian, Explorer, etc.)
│   ├── personality_profiles.py   JSON personality configs
│   └── response_modulator_v2.py  Affect → response modulation
├── dream_consolidation.py        Nightly memory synthesis engine
├── memory_pruner.py              Episodic distillation + suppression
├── hive_mind.py                  Multi-agent memory sharing
├── precognition.py               Temporal pattern mining
├── lucid_moments.py              Spontaneous memory surfacing
├── connection_pool.py            SQLite pool (WAL, thread-safe)
├── logging_config.py             Singleton logger
└── metrics.py                    Thread-safe counters/timings
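The connection_pool.py entry above (a thread-safe, WAL-mode SQLite pool) can be sketched as one connection per thread; the class and method names here are illustrative assumptions, not the shipped API:

```python
import sqlite3
import threading

class ConnectionPool:
    """Minimal thread-safe SQLite access: one connection per thread, WAL mode."""

    def __init__(self, db_path):
        self.db_path = db_path
        self._local = threading.local()

    def get(self):
        # Reuse this thread's connection if it already exists.
        conn = getattr(self._local, "conn", None)
        if conn is None:
            conn = sqlite3.connect(self.db_path)
            # WAL lets readers proceed concurrently with a single writer.
            conn.execute("PRAGMA journal_mode=WAL")
            self._local.conn = conn
        return conn
```

Thread-local connections sidestep sqlite3's default same-thread restriction without sharing a connection across threads.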

Privacy & Permissions

  • ✅ All data stored locally in ~/.nima/
  • ✅ Default: local embeddings = zero external calls
  • ✅ No NIMA-owned servers, no proprietary tracking, no analytics sent to external services
  • ⚠️ Opt-in networking: HiveMind (Redis pub/sub), Precognition (LLM endpoints), LadybugDB migrations — see Optional Features below
  • 🔒 Embedding API calls only when explicitly enabling (VOYAGE_API_KEY, OPENAI_API_KEY, etc.)

Optional Features with Network Access

Feature | Env Var | Network Calls To | Default
Cloud embeddings | NIMA_EMBEDDER=voyage | voyage.ai | Off
Cloud embeddings | NIMA_EMBEDDER=openai | openai.com | Off
Memory pruner | ANTHROPIC_API_KEY set | anthropic.com | Off
Ollama embeddings | NIMA_EMBEDDER=ollama | localhost:11434 | Off
HiveMind | HIVE_ENABLED=true | Redis pub/sub | Off
Precognition | Using external LLM | Configured endpoint | Off

Security

What Gets Installed

Component | Location | Purpose
Python core (nima_core/) | ~/.nima/ | Memory, affect, cognition
OpenClaw hooks | ~/.openclaw/extensions/nima-*/ | Capture, recall, affect
SQLite database | ~/.nima/memory/graph.sqlite | Persistent storage
Logs | ~/.nima/logs/ | Debug logs (optional)

Credential Handling

Env Var | Required? | Network Calls? | Purpose
NIMA_EMBEDDER=local | No | ❌ | Default — offline embeddings
VOYAGE_API_KEY | Only if using Voyage | ✅ voyage.ai | Cloud embeddings
OPENAI_API_KEY | Only if using OpenAI | ✅ openai.com | Cloud embeddings
ANTHROPIC_API_KEY | Only if using pruner | ✅ anthropic.com | Memory distillation
NIMA_OLLAMA_MODEL | Only if using Ollama | ❌ (localhost) | Local GPU embeddings

Recommendation: Start with NIMA_EMBEDDER=local (default). Only enable cloud providers when you need better embedding quality.
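Provider selection of this kind typically keys off NIMA_EMBEDDER with a fail-fast check for the matching API key. A hypothetical sketch (pick_embedder is not part of the nima_core API):

```python
import os

def pick_embedder():
    """Choose an embedding backend from NIMA_EMBEDDER, defaulting to local.

    Raises early when a cloud provider is selected without its key,
    rather than failing on the first network call.
    """
    provider = os.environ.get("NIMA_EMBEDDER", "local").lower()
    if provider == "voyage" and not os.environ.get("VOYAGE_API_KEY"):
        raise RuntimeError("NIMA_EMBEDDER=voyage requires VOYAGE_API_KEY")
    if provider == "openai" and not os.environ.get("OPENAI_API_KEY"):
        raise RuntimeError("NIMA_EMBEDDER=openai requires OPENAI_API_KEY")
    if provider not in {"local", "voyage", "openai", "ollama"}:
        raise ValueError(f"unknown embedder: {provider}")
    return provider
```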

Safety Features

  • Input filtering — System messages, heartbeats, and duplicates are filtered before capture
  • FTS5 injection prevention — Parameterized queries prevent SQL injection
  • Path traversal protection — All file paths are sanitized
  • Temp file cleanup — Automatic cleanup of temporary files
  • API timeouts — Network calls have reasonable timeouts (30s Voyage, 10s local)
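The FTS5 injection-prevention point can be illustrated with a parameterized MATCH query; the table name and quoting strategy here are illustrative assumptions, not code from the package:

```python
import sqlite3

def search_memories(conn, user_query, limit=10):
    """Run an FTS5 search with the query bound as a parameter.

    Binding keeps user text out of the SQL string itself; quoting the
    text as an FTS5 string additionally neutralizes MATCH operators
    (OR, NEAR, *, column filters) embedded in user input.
    """
    quoted = '"' + user_query.replace('"', '""') + '"'
    return conn.execute(
        "SELECT rowid, body FROM memories WHERE memories MATCH ? LIMIT ?",
        (quoted, limit),
    ).fetchall()
```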

Best Practices

  1. Review before installing — Inspect install.sh and hook files before running
  2. Backup config — Backup ~/.openclaw/openclaw.json before adding hooks
  3. Don't run as root — Installation writes to user home directories
  4. Use containerized envs — Test in a VM or container first if unsure
  5. Rotate API keys — If using cloud embeddings, rotate keys periodically
  6. Monitor logs — Check ~/.nima/logs/ for suspicious activity

Data Locations

~/.nima/
├── memory/
│   ├── graph.sqlite       # SQLite backend (default)
│   ├── ladybug.lbug       # LadybugDB backend (optional)
│   ├── embedding_cache.db # Cached embeddings
│   └── embedding_index.npy # Vector index
├── affect/
│   └── affect_state.json  # Current emotional state
└── logs/                  # Debug logs (if enabled)

~/.openclaw/extensions/
├── nima-memory/           # Capture hook
├── nima-recall-live/      # Recall hook
└── nima-affect/           # Affect hook

Controls:

{
  "plugins": {
    "entries": {
      "nima-memory": {
        "skip_subagents": true,
        "skip_heartbeats": true,
        "noise_filtering": { "filter_system_noise": true }
      }
    }
  }
}

Configuration

Embedding Providers

Provider | Setup | Dims | Cost
Local (default) | NIMA_EMBEDDER=local | 384 | Free
Voyage AI | NIMA_EMBEDDER=voyage + VOYAGE_API_KEY | 1024 | $0.12/1M tok
OpenAI | NIMA_EMBEDDER=openai + OPENAI_API_KEY | 1536 | $0.13/1M tok
Ollama | NIMA_EMBEDDER=ollama + NIMA_OLLAMA_MODEL | 768 | Free

Database Backend

Metric | SQLite (default) | LadybugDB (recommended)
Text Search | 31ms | 9ms (3.4x faster)
Vector Search | External | Native HNSW (18ms)
Graph Queries | SQL JOINs | Native Cypher
DB Size | ~91 MB | ~50 MB (44% smaller)

Upgrade: pip install real-ladybug && python -c "from nima_core.storage import migrate; migrate()"

All Environment Variables

# Embedding (default: local)
NIMA_EMBEDDER=local|voyage|openai|ollama
VOYAGE_API_KEY=pa-xxx
OPENAI_API_KEY=sk-xxx
NIMA_OLLAMA_MODEL=nomic-embed-text

# Data paths
NIMA_DATA_DIR=~/.nima
NIMA_DB_PATH=~/.nima/memory/ladybug.lbug

# Memory pruner
NIMA_DISTILL_MODEL=claude-haiku-4-5
ANTHROPIC_API_KEY=sk-ant-xxx

# Logging
NIMA_LOG_LEVEL=INFO
NIMA_DEBUG_RECALL=1

Hooks

Hook | Fires | Does
nima-memory | After save | Captures 3 layers → filters noise → stores in graph DB
nima-recall-live | Before LLM | Searches memories → scores by ecology → injects as context (3000 token budget)
nima-affect | On message | VADER sentiment → Panksepp 7-affect state → archetype modulation
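Token-budgeted injection (the 3000-token budget in nima-recall-live) amounts to a greedy fill over score-ranked memories. A toy sketch, using a rough len // 4 token estimate rather than a real tokenizer:

```python
def inject_within_budget(memories, budget_tokens=3000):
    """Greedy fill: take highest-scoring memories until the budget is spent.

    memories: iterable of (score, text). Token cost is approximated as
    len(text) // 4; the actual hook presumably uses a real tokenizer.
    """
    chosen = []
    used = 0
    for score, text in sorted(memories, key=lambda m: m[0], reverse=True):
        cost = max(1, len(text) // 4)
        if used + cost > budget_tokens:
            continue  # too big for the remaining budget; try smaller ones
        chosen.append(text)
        used += cost
    return chosen
```

Skipping oversized items instead of stopping lets shorter, lower-ranked memories still fill the remaining budget.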

Installation

./install.sh
openclaw gateway restart

Or manual:

cp -r openclaw_hooks/nima-memory ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-recall-live ~/.openclaw/extensions/
cp -r openclaw_hooks/nima-affect ~/.openclaw/extensions/

Advanced Features

Dream Consolidation

Nightly synthesis extracts insights and patterns from episodic memory:

python -m nima_core.dream_consolidation
# Or schedule via OpenClaw cron at 2 AM

Memory Pruner

Distills old conversations into semantic gists, suppresses raw noise:

python -m nima_core.memory_pruner --min-age 14 --live
python -m nima_core.memory_pruner --restore 12345  # undo within 30 days

Hive Mind

Multi-agent memory sharing:

from nima_core import HiveMind
hive = HiveMind(db_path="~/.nima/memory/ladybug.lbug")
context = hive.build_agent_context("research task", max_memories=8)
hive.capture_agent_result("agent-1", "result summary", "model-name")

Precognition

Temporal pattern mining → predictive memory pre-loading:

from nima_core import NimaPrecognition
precog = NimaPrecognition(db_path="~/.nima/memory/ladybug.lbug")
precog.run_mining_cycle()
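Temporal pattern mining of this sort can be reduced to counting recurring (time, topic) pairs; a toy stand-in for run_mining_cycle(), with the min_count threshold as an assumed parameter:

```python
from collections import Counter

def mine_hourly_patterns(events, min_count=3):
    """Find (hour, topic) pairs that recur often enough to pre-load.

    events: iterable of (hour_of_day, topic) tuples. Pairs seen at
    least min_count times become candidates for predictive recall.
    """
    counts = Counter(events)
    return {pair: n for pair, n in counts.items() if n >= min_count}
```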

Lucid Moments

Spontaneous surfacing of emotionally-resonant memories (with safety: trauma filtering, quiet hours, daily caps):

from nima_core import LucidMoments
lucid = LucidMoments(db_path="~/.nima/memory/ladybug.lbug")
moment = lucid.surface_moment()
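The quiet-hours and daily-cap safety gates can be sketched as a simple predicate; the 22:00 to 07:00 window and cap of 3 are assumed values, not the shipped defaults:

```python
from datetime import datetime

def may_surface(now, surfaced_today, daily_cap=3, quiet_start=22, quiet_end=7):
    """Gate spontaneous surfacing: enforce a daily cap and quiet hours."""
    if surfaced_today >= daily_cap:
        return False
    hour = now.hour
    # The quiet window wraps past midnight (e.g. 22:00-07:00).
    in_quiet = hour >= quiet_start or hour < quiet_end
    return not in_quiet
```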

Affect System

Panksepp 7-affect emotional intelligence with personality archetypes:

from nima_core import DynamicAffectSystem
affect = DynamicAffectSystem(identity_name="my_bot", baseline="guardian")
state = affect.process_input("I'm excited about this!")
# Archetypes: guardian, explorer, trickster, empath, sage

API

from nima_core import (
    DynamicAffectSystem,
    get_affect_system,
    HiveMind,
    NimaPrecognition,
    LucidMoments,
)

# Affect (thread-safe singleton)
affect = get_affect_system(identity_name="lilu")
state = affect.process_input("Hello!")

# Hive Mind
hive = HiveMind()
context = hive.build_agent_context("task description")

# Precognition
precog = NimaPrecognition()
precog.run_mining_cycle()

# Lucid Moments
lucid = LucidMoments()
moment = lucid.surface_moment()

Changelog

See CHANGELOG.md for full version history.

Recent Releases

  • v3.0.4 (Feb 23, 2026) — Darwinian memory engine, new CLIs, installer, bug fixes
  • v2.5.0 (Feb 21, 2026) — Hive Mind, Precognition, Lucid Moments
  • v2.4.0 (Feb 20, 2026) — Dream Consolidation engine
  • v2.3.0 (Feb 19, 2026) — Memory Pruner, connection pool, Ollama support
  • v2.2.0 (Feb 19, 2026) — VADER Affect, 4-phase noise remediation, ecology scoring
  • v2.0.0 (Feb 13, 2026) — LadybugDB backend, security hardening, 348 tests

License

MIT — free for any AI agent, commercial or personal.

Files

67 total
