{
  "skill": {
    "slug": "ollama-memory-setup",
    "displayName": "Ollama Memory Setup",
    "summary": "Sets up local semantic memory search for OpenClaw using Ollama + nomic-embed-text. Use when: (1) memory_search returns 'node-llama-cpp is missing' or 'Local...",
    "tags": { "latest": "1.0.0" },
    "stats": {
      "comments": 0,
      "downloads": 140,
      "installsAllTime": 0,
      "installsCurrent": 0,
      "stars": 0,
      "versions": 1
    },
    "createdAt": 1774747262094,
    "updatedAt": 1774747609646
  },
  "latestVersion": {
    "version": "1.0.0",
    "createdAt": 1774747262094,
    "changelog": "Initial release — enables local semantic memory search with Ollama for OpenClaw.\n\n- Provides a local, private alternative to node-llama-cpp using Ollama’s embedding API.\n- Fixes errors when node-llama-cpp is missing or fails by routing embeddings via Ollama and nomic-embed-text.\n- No external API keys or cloud required; works fully offline on macOS and Linux.\n- Includes both automatic and manual setup instructions.\n- Troubleshooting guide provided for common issues.",
    "license": "MIT-0"
  },
  "metadata": null,
  "owner": {
    "handle": "brasco05",
    "userId": "s177sz08tyrpncx75y5y4em2rn83v35v",
    "displayName": "brasco05",
    "image": "https://avatars.githubusercontent.com/u/256532966?v=4"
  },
  "moderation": null
}