{
  "skill": {
    "slug": "offline-llama",
    "displayName": "Offline Llama",
    "summary": "Manage local Ollama models autonomously with health monitoring, automatic fallback, self-healing, and offline operation without internet dependency.",
    "tags": { "latest": "1.0.0" },
    "stats": {
      "comments": 0,
      "downloads": 788,
      "installsAllTime": 7,
      "installsCurrent": 4,
      "stars": 0,
      "versions": 1
    },
    "createdAt": 1771398438302,
    "updatedAt": 1777525218680
  },
  "latestVersion": {
    "version": "1.0.0",
    "createdAt": 1771398438302,
    "changelog": "Initial release: Enables autonomous, offline-first management and health monitoring of local Ollama models.\n\n- Monitors local model health with automatic fallback and dynamic model switching.\n- Includes self-healing: restarts Ollama, clears resources, reinstalls models if needed.\n- Maintains continuous operation and degraded mode when all models are unavailable.\n- Detects internet connectivity to switch between local and remote models when necessary.\n- Provides commands for manual control (status checks, switching, restarting, clearing cache).\n- Fully local operation—no external dependencies, preserving privacy.",
    "license": null
  },
  "metadata": null,
  "owner": {
    "handle": "and-ray-m",
    "userId": "publishers:and-ray-m",
    "displayName": "and-ray-m",
    "image": "https://avatars.githubusercontent.com/u/257538632?v=4"
  },
  "moderation": {
    "isSuspicious": true,
    "isMalwareBlocked": false,
    "verdict": "suspicious",
    "reasonCodes": ["suspicious.llm_suspicious", "suspicious.vt_suspicious"],
    "summary": "Detected: suspicious.llm_suspicious, suspicious.vt_suspicious",
    "engineVersion": "v2.4.5",
    "updatedAt": 1777525218680
  }
}