{
  "skill": {
    "slug": "model-distill-master",
    "displayName": "模型蒸馏大师",
    "summary": "Model Distillation Master: a complete workflow for transferring large-model capabilities into a smaller model. Supports adaptive distillation, curriculum learning, capability-aware training, adversarial training, and multi-dimensional evaluation. Trigger phrases: \"distill a model\", \"distill XX into YY\", \"compress a model\", \"build a small model\", \"teacher model analysis\". Default student model: gemma-3-4b-it (4B parameters, suitable for edge deployment).",
    "tags": {
      "adversarial-training": "1.0.0",
      "curriculum-learning": "1.0.0",
      "edge-deployment": "1.0.0",
      "fine-tuning": "1.0.0",
      "gemma": "1.0.0",
      "knowledge-distillation": "1.0.0",
      "latest": "1.0.0",
      "model-distillation": "1.0.0",
      "qlora": "1.0.0"
    },
    "stats": {
      "comments": 0,
      "downloads": 76,
      "installsAllTime": 0,
      "installsCurrent": 0,
      "stars": 0,
      "versions": 1
    },
    "createdAt": 1776054169190,
    "updatedAt": 1776054410702
  },
  "latestVersion": {
    "version": "1.0.0",
    "createdAt": 1776054169190,
    "changelog": "model-distill-master v1.0.0\n\n- Initial release of 模型蒸馏大师 (Model Distillation Master), providing a comprehensive workflow for distilling large models into smaller ones.\n- Supports adaptive distillation, curriculum learning, capability analysis, adversarial training, and multi-dimensional evaluation.\n- Default student model is gemma-3-4b-it (4B parameters, suitable for edge deployment).\n- Step-by-step, checkpoint-based protocol includes environment setup, teacher model analysis, synthetic data generation, and automated training script/configuration generation.\n- Multi-agent subtask structure for parallel data analysis, distillation data synthesis, and output evaluation.",
    "license": "MIT-0"
  },
  "metadata": null,
  "owner": {
    "handle": "shixiangyu2",
    "userId": "s172tssf154bb00zr1jxkvpsrs83h4b9",
    "displayName": "ShiXiangYu2",
    "image": "https://avatars.githubusercontent.com/u/215228006?v=4"
  },
  "moderation": {
    "isSuspicious": true,
    "isMalwareBlocked": false,
    "verdict": "suspicious",
    "reasonCodes": ["suspicious.dynamic_code_execution", "suspicious.llm_suspicious"],
    "summary": "Detected: suspicious.dynamic_code_execution, suspicious.llm_suspicious",
    "engineVersion": "v2.2.0",
    "updatedAt": 1776054410702
  }
}