{"skill":{"slug":"deepseek-deepseek-v3","displayName":"DeepSeek — DeepSeek-V3, DeepSeek-R1, DeepSeek-Coder on Your Local Devices","summary":"DeepSeek models on your local fleet — DeepSeek-V3, DeepSeek-V3.2, DeepSeek-R1, DeepSeek-Coder routed across multiple devices via Ollama Herd. 7-signal scorin...","tags":{"apple-silicon":"1.0.1","code-generation":"1.0.1","deepseek":"1.0.1","deepseek-coder":"1.0.1","deepseek-coder-v2":"1.0.1","deepseek-r1":"1.0.1","deepseek-v3":"1.0.1","deepseek-v3.2":"1.0.1","latest":"1.0.1","local-llm":"1.0.1","ollama":"1.0.1","reasoning":"1.0.1"},"stats":{"comments":0,"downloads":109,"installsAllTime":0,"installsCurrent":0,"stars":3,"versions":1},"createdAt":1774909948169,"updatedAt":1774918908847},"latestVersion":{"version":"1.0.1","createdAt":1774909948169,"changelog":"Initial public release of DeepSeek models on local hardware through Ollama Herd.\n\n- Run DeepSeek-V3, V3.2, R1, and Coder models locally on Apple Silicon or Linux, with zero cloud costs.\n- Supports automatic fleet routing: selects the best node for each request based on 7-signal scoring, with seamless failover and VRAM-aware fallback.\n- Compatible with OpenAI and Ollama APIs for chat, code, image generation, speech-to-text, and embeddings.\n- Provides setup instructions, recommended hardware guidance, and dashboard monitoring at a unified endpoint.\n- Prioritizes privacy, local performance, and user control over model management.","license":"MIT-0"},"metadata":{"os":["darwin","linux"],"systems":null},"owner":{"handle":"twinsgeeks","userId":"s17dgy27g44azc3tday4qh394d83ensj","displayName":"Twin Geeks","image":"https://avatars.githubusercontent.com/u/261838102?v=4"},"moderation":null}