Install
openclaw skills install openclaw-ollama-memory

Set up local semantic memory search in OpenClaw using Ollama + nomic-embed-text: free, private, offline-capable, no API key required. It replaces cloud embedding APIs (OpenAI, Gemini) with a locally running model. Trigger phrases: ollama memory, local embeddings, private memory search, offline memory, free embeddings, nomic-embed-text.
Download Ollama from https://ollama.ai, then pull the embedding model:
ollama pull nomic-embed-text
Verify Ollama is running:
curl http://127.0.0.1:11434/api/tags
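The curl above returns a JSON list of installed models. If you want to check for the embedding model programmatically, here is a minimal sketch; `has_model` is a hypothetical helper, and the sample payload is abbreviated (a real /api/tags response carries more fields per model):

```python
# Sketch: confirm nomic-embed-text appears in an Ollama /api/tags response.
# The sample payload is abbreviated for illustration.
import json

def has_model(tags_json: str, model: str) -> bool:
    """True if any installed model name starts with the given prefix."""
    models = json.loads(tags_json).get("models", [])
    return any(m.get("name", "").startswith(model) for m in models)

sample = '{"models": [{"name": "nomic-embed-text:latest"}]}'
print(has_model(sample, "nomic-embed-text"))  # → True
```

In practice you would feed it the body of `curl http://127.0.0.1:11434/api/tags` instead of the sample string.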
Edit ~/.openclaw/openclaw.json and add under agents.defaults.memorySearch:
{
  "agents": {
    "defaults": {
      "memorySearch": {
        "provider": "ollama",
        "model": "nomic-embed-text:latest",
        "remote": {
          "baseUrl": "http://127.0.0.1:11434"
        }
      }
    }
  }
}
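If you prefer to apply the snippet above programmatically rather than hand-editing the file, a minimal sketch follows. `enable_ollama_memory` is a hypothetical helper; only the agents.defaults.memorySearch key path and values come from this doc:

```python
# Sketch: merge the memorySearch settings into an existing config dict,
# leaving all other keys untouched.
import json

MEMORY_SEARCH = {
    "provider": "ollama",
    "model": "nomic-embed-text:latest",
    "remote": {"baseUrl": "http://127.0.0.1:11434"},
}

def enable_ollama_memory(config: dict) -> dict:
    """Return a copy of the config with memorySearch pointed at local Ollama."""
    merged = json.loads(json.dumps(config))  # cheap deep copy
    merged.setdefault("agents", {}).setdefault("defaults", {})["memorySearch"] = MEMORY_SEARCH
    return merged

example = {"agents": {"defaults": {"someExistingKey": True}}}  # placeholder key
print(json.dumps(enable_ollama_memory(example), indent=2))
```

In practice you would read ~/.openclaw/openclaw.json, pass the parsed dict through, and write the result back.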
Restart the gateway so the new config takes effect:

openclaw gateway restart
Notes:

- Set "provider": "ollama", not "openai". OpenClaw doesn't auto-detect Ollama.
- The base URL is http://127.0.0.1:11434 (no /v1 suffix, no trailing slash).
- See references/config-reference.md for advanced tuning (hybrid search, MMR, temporal decay, extra paths).
- See references/troubleshooting.md for common errors and fixes.
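To see what "semantic" search means here, a toy sketch of embedding-based ranking follows. This is an illustration of the general technique, not OpenClaw's internals; the vectors are tiny mock embeddings (nomic-embed-text actually produces 768-dimensional ones):

```python
# Sketch: rank stored memory snippets by cosine similarity between the
# query embedding and each snippet's embedding.
import math

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

memory = {
    "note about deploys": [0.9, 0.1, 0.0],  # mock embedding
    "note about recipes": [0.0, 0.2, 0.9],  # mock embedding
}
query = [0.8, 0.2, 0.1]  # mock embedding of a deploy-related question

ranked = sorted(memory, key=lambda k: cosine(query, memory[k]), reverse=True)
print(ranked[0])  # → note about deploys
```

This is why no keyword overlap is needed: snippets whose embeddings point in a similar direction to the query's embedding score highest.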