{
  "skill": {
    "slug": "linux-ollama",
    "displayName": "Linux Ollama",
    "summary": "Linux Ollama — run Ollama on Linux with fleet routing across multiple Linux machines. Linux Ollama setup for Llama, Qwen, DeepSeek, Phi, Mistral. Route Ollam...",
    "tags": { "latest": "1.0.0" },
    "stats": {
      "comments": 0,
      "downloads": 124,
      "installsAllTime": 2,
      "installsCurrent": 2,
      "stars": 0,
      "versions": 1
    },
    "createdAt": 1775256826085,
    "updatedAt": 1775256850529
  },
  "latestVersion": {
    "version": "1.0.0",
    "createdAt": 1775256826085,
    "changelog": "Initial release of Linux Ollama — multi-machine Ollama fleet routing for Linux.\n\n- Run Ollama inference across multiple Linux machines with automatic load balancing and routing.\n- Supports setup for Llama, Qwen, DeepSeek, Phi, Mistral models on servers, desktops, and edge devices.\n- Integrates with systemd for reliable service management.\n- Provides a dashboard, fleet health checks, and structured logging.\n- Includes firewall configuration, GPU/CPU tips, and OpenAI API compatibility.\n- No automatic model downloads or deletions; all actions require user confirmation.",
    "license": "MIT-0"
  },
  "metadata": { "os": ["linux"], "systems": null },
  "owner": {
    "handle": "twinsgeeks",
    "userId": "s17dgy27g44azc3tday4qh394d83ensj",
    "displayName": "Twin Geeks",
    "image": "https://avatars.githubusercontent.com/u/261838102?v=4"
  },
  "moderation": null
}