{"skill":{"slug":"local-researcher","displayName":"Local Researcher","summary":"A fully local deep research assistant Skill. Uses a local Ollama or LMStudio LLM to perform iterative web research and generate Markdown reports with cited sources. Triggered when the user needs privacy-first research, local document analysis, or structured research report generation.","tags":{"langchain":"1.0.0","latest":"1.0.0","lmstudio":"1.0.0","local":"1.0.0","ollama":"1.0.0","privacy":"1.0.0","research":"1.0.0"},"stats":{"comments":0,"downloads":123,"installsAllTime":0,"installsCurrent":0,"stars":0,"versions":1},"createdAt":1774239945801,"updatedAt":1774240609257},"latestVersion":{"version":"1.0.0","createdAt":1774239945801,"changelog":"- Initial release of Local Researcher: an entirely local, privacy-preserving research assistant for deep web studies.\n- Supports both Ollama and LMStudio as LLM providers, selectable via environment variables.\n- Iterative, automated web research workflow with multiple search provider options (DuckDuckGo, Tavily, Perplexity, SearXNG).\n- Outputs professional Markdown reports with cited sources, research summaries, and workflow metadata.\n- Flexible configuration for local model selection, research depth, and integration into custom or programmatic workflows.\n- Includes step-by-step guides for installation, Docker deployment, troubleshooting, and advanced customization.","license":"MIT-0"},"metadata":null,"owner":{"handle":"antonia-sz","userId":"s1730srvecg82e9ex26q7shpd184ghxy","displayName":"antonia huang","image":"https://avatars.githubusercontent.com/u/143588581?v=4"},"moderation":{"isSuspicious":true,"isMalwareBlocked":false,"verdict":"suspicious","reasonCodes":["suspicious.llm_suspicious"],"summary":"Detected: suspicious.llm_suspicious","engineVersion":"v2.2.0","updatedAt":1774240609257}}