Install
openclaw skills install memori-extension

Memory augmentation and LLM call interception using the Memori Python library with optional Zhipu AI API integration.
This skill provides memory augmentation and LLM call interception capabilities using the Memori Python library. It enables agents to retrieve relevant knowledge from a memory database and inject it into conversations.
⚠️ Important: This skill performs the following operations:
- Reads and writes a local database (default: ./memori.db)
- Reads an optional technical terms file (default: ./config/tech_terms.txt)
- Makes external API calls to Zhipu AI only if the ZHIPUAI_API_KEY environment variable is set
- If ZHIPUAI_API_KEY is NOT set, the skill operates 100% locally with NO external network calls

Recommendation: Run the skill without ZHIPUAI_API_KEY first to verify local functionality. Only set ZHIPUAI_API_KEY if you explicitly consent to sending conversation data to external services.

This skill requires the following Python packages:
pip install memori
The memori library is licensed under Apache License Version 2.0.
For Zhipu API augmentation (optional):
pip install zhipuai
Warning: Installing zhipuai and providing ZHIPUAI_API_KEY enables conversation content to be sent to external servers.
| Variable | Description | Required | Default | Privacy Note |
|---|---|---|---|---|
| ZHIPUAI_API_KEY | Zhipu AI API key for conversation augmentation | No | - | ⚠️ Enables external API calls - conversation text will be sent to Zhipu AI servers |
| ZHIPUAI_MODEL | Zhipu AI model name | No | glm-4.7 | Only used if ZHIPUAI_API_KEY is set |
| MEMORI_TECH_TERMS | Comma-separated technical terms for LLM interception | No | - | Local only |
| MEMORI_TECH_TERMS_FILE | Path to a file containing technical terms (one per line) | No | ./config/tech_terms.txt | Local only - read/write |
| MEMORI_DB_PATH | Path to the Memori database | No | ./memori.db | Local only - read/write |
Privacy & Data Flow Notes:
- Only ZHIPUAI_API_KEY enables external network calls. All other variables control local file operations.
- Without ZHIPUAI_API_KEY, the skill operates 100% locally with no external data transmission.
- The skill reads and writes the local database (memori.db) and, optionally, the tech terms file (tech_terms.txt). These are stored on your local filesystem.
- Recommendation: Start without ZHIPUAI_API_KEY to test local functionality. Only enable the external API if you need enhanced features and consent to the data transmission.

System environment:
# Optional: Enable Zhipu API augmentation
export ZHIPUAI_API_KEY="your-api-key"
export ZHIPUAI_MODEL="glm-4.7"
# Optional: Customize technical terms
export MEMORI_TECH_TERMS="FFI,Rust,Linux,kernel,spinlock"
# Optional: Use custom database path
export MEMORI_DB_PATH="/path/to/memori.db"
OpenClaw configuration (openclaw.json):
{
"skills": {
"entries": {
"memori-extension": {
"enabled": true,
"env": {
"ZHIPUAI_API_KEY": "your-api-key",
"ZHIPUAI_MODEL": "glm-4.7",
"MEMORI_TECH_TERMS": "FFI,Rust,Linux,kernel",
"MEMORI_DB_PATH": "./memori.db"
}
}
}
}
}
Technical terms file (optional):
# Create config directory
mkdir -p config
# Create terms file
cat > config/tech_terms.txt << EOF
FFI
Rust
Linux
kernel
spinlock
mutex
unsafe
EOF
# Set environment variable
export MEMORI_TECH_TERMS_FILE="config/tech_terms.txt"
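Both term sources can be resolved with plain stdlib code. A minimal sketch of how the skill might combine them (the helper name load_tech_terms and the merge behavior are assumptions for illustration, not the skill's actual implementation):

```python
import os
from pathlib import Path

DEFAULT_TERMS_FILE = Path("./config/tech_terms.txt")

def load_tech_terms() -> set[str]:
    """Collect technical terms from MEMORI_TECH_TERMS (comma-separated)
    and MEMORI_TECH_TERMS_FILE (one term per line). Hypothetical helper."""
    terms: set[str] = set()

    # Comma-separated environment variable, ignoring empty entries.
    env_terms = os.environ.get("MEMORI_TECH_TERMS", "")
    terms.update(t.strip() for t in env_terms.split(",") if t.strip())

    # Optional terms file; falls back to the documented default path.
    path = Path(os.environ.get("MEMORI_TECH_TERMS_FILE", DEFAULT_TERMS_FILE))
    if path.is_file():
        terms.update(
            line.strip() for line in path.read_text().splitlines() if line.strip()
        )
    return terms
```

Merging both sources (rather than letting one override the other) keeps the behavior predictable when a user sets both variables.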
from memori import Memori

# Initialize a Memori instance (entity_id scopes memories to a namespace)
memori = Memori(
    db_path="memori.db",
    entity_id="knowledge-base"
)
# Search memories
memories = memori.search("query", limit=5)
# Augment query
context = memori.augment("How to handle spinlock conflicts?", limit=3)
# Store memory
memory_id = memori.store("New content")
# Get stats
stats = memori.get_stats()
# Close
memori.close()
from skills.memori_extension import search, augment, intercept_llm
# Search
memories = search("FFI bindings", limit=5)
# Augment
enhanced = augment("How to handle spinlock conflicts?")
if enhanced:
print(enhanced)
# Intercept LLM
messages = [{"role": "user", "content": "FFI question"}]
enhanced = intercept_llm(messages)
__init__(db_path, entity_id)
Initialize a Memori instance.
Parameters:
- db_path (str | Path, optional): Database path
- entity_id (str, optional): Entity ID, default "default"

search(query, limit, entity_id)
Search for relevant memories.
Returns: List[Memory]

augment(query, limit, entity_id)
Augment a query with retrieved memories.
Returns: AugmentedContext

store(content, entity_id, metadata)
Store a new memory.
Returns: int - Memory ID

get_stats(entity_id)
Get statistics.
Returns: dict

close()
Close the database connection.
Memory object with attributes:
- id (int): Memory ID
- entity_id (str): Entity ID
- content (str): Memory content
- created_at (str): Creation timestamp
- metadata (dict, optional): Metadata

AugmentedContext object with attributes:
- original_query (str): Original query
- retrieved_memories (List[Memory]): Retrieved memories
- enhanced_prompt (str): Augmented prompt
- has_memories (bool): Whether memories were retrieved
- memories_count (int): Number of retrieved memories

Default database path: ./memori.db
The skill uses configurable technical terms for LLM call interception. You can customize these via:
1. Environment variable (comma-separated):
export MEMORI_TECH_TERMS="FFI,Rust,Linux,kernel,spinlock,mutex,unsafe"
2. Configuration file (one term per line):
# Create config file
mkdir -p config
cat > config/tech_terms.txt << EOF
FFI
Rust
Linux
kernel
spinlock
mutex
unsafe
EOF
# Set environment variable
export MEMORI_TECH_TERMS_FILE="config/tech_terms.txt"
Note: If neither is set, the skill will still work but may not intercept technical queries as effectively.
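One way term matching could drive the interception decision is a whole-word scan over user messages. The helper below (should_intercept) is hypothetical; the skill's actual matching logic may differ:

```python
import re

def should_intercept(messages: list[dict], terms: set[str]) -> bool:
    """Return True if any user message mentions a configured technical term.
    Hypothetical helper illustrating term-based interception."""
    if not terms:
        return False  # no terms configured: nothing to match
    # Whole-word, case-insensitive match against all configured terms.
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(t) for t in terms) + r")\b", re.IGNORECASE
    )
    return any(
        m.get("role") == "user" and pattern.search(m.get("content", ""))
        for m in messages
    )

messages = [{"role": "user", "content": "How do I write FFI bindings in Rust?"}]
```

Escaping each term with re.escape keeps terms containing regex metacharacters (e.g. "C++") from breaking the pattern.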
This skill performs the following file operations:
| Operation | File | Description |
|---|---|---|
| Read | ./memori.db (default) | Retrieve stored memories |
| Write | ./memori.db (default) | Store new memories |
| Read | MEMORI_TECH_TERMS_FILE | Load technical terms |
| Write | MEMORI_TECH_TERMS_FILE | Persist terms (if enabled) |
⚠️ Important: If ZHIPUAI_API_KEY is set, this skill may send conversation text to Zhipu AI's servers for augmentation.
To disable external API calls, unset ZHIPUAI_API_KEY.

skills/memori_extension/
├── __init__.py # Skill entry
├── memori_extension.py # Skill implementation
├── SKILL.md # This file
└── README.md # Quick start guide
This skill is licensed under the Apache License, Version 2.0.
It incorporates the Memori Python library, which is also licensed under the Apache License 2.0; see the LICENSE file for the complete license text.
Memori Library: