Dify
v1.0.0
A guide to the Dify AI application development platform, used for building LLM applications, workflows, agents, and knowledge bases. Activate this skill when the user needs to (1) create AI applications with Dify, (2) design LLM workflows, (3) configure knowledge-base RAG, (4) develop AI agents, (5) call the Dify API, or (6) self-host a Dify deployment.
Security Scan
OpenClaw
Benign
high confidence
Purpose & Capability
The name/description match the content: this is a Dify guide for building apps, workflows, agents, RAG KBs and for self-hosting. However, the declared requirements list no required binaries or env vars while the SKILL.md contains concrete shell commands that assume git, curl, jq, docker and docker-compose are available — a minor metadata inconsistency.
Instruction Scope
SKILL.md and the reference files are documentation and example API calls. The instructions include cloning from GitHub and running docker-compose, plus example HTTP requests requiring an API key in an Authorization header. They do not instruct reading unrelated local credentials, exfiltrating secrets, or calling unknown endpoints beyond Dify's documented URLs. The presence of deployment commands is expected for a self-hosting guide.
Install Mechanism
This is an instruction-only skill with no install spec and no code files. That is low-risk: nothing will be written or executed by the skill itself beyond the agent following textual instructions.
Credentials
The docs reference an API Key for authenticating to Dify APIs, but the skill declares no required env vars. No other credentials or unrelated environment variables are requested. The lack of declared primaryEnv or required env vars is consistent with a public documentation skill, but the guide does show places where a user would need to supply API keys when actually using the API.
Persistence & Privilege
always:false and default invocation settings are used. The skill does not request persistent/privileged presence or modify other skills or system settings.
Assessment
This skill is a documentation/instruction pack for Dify and appears coherent with that purpose. Before installing or having an agent execute the example commands, note:
(1) the SKILL.md examples assume tools (git, curl, jq, docker, docker-compose) that the metadata did not list; make sure those binaries are available and trusted;
(2) deployment examples clone the public GitHub repo and run docker-compose, so verify the repo and .env.example contents before running to avoid accidentally exposing secrets or running unreviewed containers;
(3) the API examples show using a Bearer API key; the skill does not request any secrets itself, so only provide keys when you trust the target endpoint;
(4) if you plan to allow the agent to execute shell/network actions autonomously, be cautious, since those actions can start containers or make network requests.
If you want higher assurance, inspect the upstream GitHub repository indicated (https://github.com/langgenius/dify) and the Docker images used before running them.
Like a lobster shell, security has layers: review code before you run it.
Dify AI Application Development Platform
Dify is an open-source LLM application development platform for rapidly building AI applications, workflows, and agents.
Core Concepts
Application Types
- Chat App - conversational application with persistent sessions; suited to chatbots and customer-service AI
- Workflow App - workflow application with stateless execution; suited to translation, writing, and summarization
- Agent - agent application supporting tool calls and autonomous planning
- Completion App - text-completion application with single request/response
Core Components
- Studio - the application-building workbench
- Knowledge Base - knowledge bases with RAG retrieval augmentation
- Model Providers - model provider configuration
- Tools - tool integrations
Workflow Nodes
Input & Output
- User Input - user input; defines input variables
- Output - the output result
Logic Control
- IF/ELSE - conditional branching
- Iteration - iterative loops
- Parallel - parallel execution
Data Processing
- Parameter Extractor - uses an LLM to extract structured data from natural language
- List Operator - filters and transforms arrays
- Variable Aggregator - variable aggregation
- Template - template rendering
LLM Nodes
- LLM - large language model invocation
- Question Classifier - question classification
- Knowledge Retrieval - knowledge retrieval
Tool Nodes
- HTTP Request - HTTP requests
- Code - code execution (Python/JavaScript)
- Tool - tool invocation
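The Code node above executes a short script inside the workflow. A minimal sketch of a Python Code-node body, assuming Dify's convention of a `main` function whose returned dict keys become the node's output variables (verify against your Dify version):

```python
# A Code-node body: Dify calls main() with the node's mapped input
# variables and exposes the returned dict keys as output variables.
def main(text: str) -> dict:
    words = text.split()
    return {
        "word_count": len(words),              # becomes output variable "word_count"
        "first_word": words[0] if words else "",
    }
```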
Quick Start
Deploy Dify (Docker)
# Clone the latest release
git clone --branch "$(curl -s https://api.github.com/repos/langgenius/dify/releases/latest | jq -r .tag_name)" https://github.com/langgenius/dify.git
# Start
cd dify/docker
cp .env.example .env
docker compose up -d
Visit http://localhost/install to initialize the admin account.
System Requirements
- CPU >= 2 Core
- RAM >= 4 GiB
- Docker 19.03+
- Docker Compose 1.28+
API Calls
Authentication
All API requests must carry an API Key in the header:
Authorization: Bearer {api_key}
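In Python this header can be built once and reused across calls; a minimal sketch (the key value is a placeholder):

```python
def auth_headers(api_key: str) -> dict:
    # Every Dify API call carries the App API key as a Bearer token.
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

headers = auth_headers("app-xxxxxxxx")  # placeholder key from the app's API Access page
```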
Run a Workflow
POST /v1/workflows/run
Content-Type: application/json
{
"inputs": {
"query": "Translate this text..."
},
"response_mode": "blocking", # or "streaming"
"user": "user-123"
}
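The workflow-run call can be sketched with only the Python standard library. The base URL below is an assumption (Dify Cloud exposes https://api.dify.ai/v1; a self-hosted instance serves the same paths on its own host), and the key is a placeholder:

```python
import json
import urllib.request

def build_workflow_request(api_key: str, inputs: dict, user: str,
                           base_url: str = "https://api.dify.ai/v1"):
    """Build the POST /v1/workflows/run request (blocking mode)."""
    body = {
        "inputs": inputs,
        "response_mode": "blocking",  # use "streaming" for SSE output
        "user": user,
    }
    return urllib.request.Request(
        f"{base_url}/workflows/run",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending it is a network call and needs a reachable Dify instance:
# with urllib.request.urlopen(build_workflow_request(key, {"query": "..."}, "user-123")) as r:
#     result = json.load(r)
```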
Send a Chat Message
POST /v1/chat-messages
Content-Type: application/json
{
"query": "Hello",
"response_mode": "streaming",
"user": "user-123",
"conversation_id": "" # empty on the first message; pass the returned conversation_id on subsequent messages
}
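The conversation_id handling can be wrapped in a small helper that tracks the id across turns; a sketch (field names follow the request/response shown here):

```python
class ChatSession:
    """Tracks the conversation_id across turns of a Dify chat app."""

    def __init__(self, user: str):
        self.user = user
        self.conversation_id = ""  # empty on the first message

    def payload(self, query: str) -> dict:
        # Request body for POST /v1/chat-messages.
        return {
            "query": query,
            "response_mode": "streaming",
            "user": self.user,
            "conversation_id": self.conversation_id,
        }

    def record_response(self, response: dict) -> None:
        # Chat responses include the conversation_id to reuse on follow-ups.
        self.conversation_id = response.get("conversation_id", self.conversation_id)
```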
Response Modes
- blocking - wait synchronously for the complete response
- streaming - SSE streaming response
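A streaming response arrives as Server-Sent Events, one `data: {...}` line per chunk. A minimal parsing sketch; the `event`/`answer` field names are assumptions based on Dify's chat streaming format, so confirm them against your version's API reference:

```python
import json

def iter_sse_events(lines):
    """Yield parsed JSON payloads from 'data: {...}' lines of an SSE stream."""
    for raw in lines:
        line = raw.strip()
        if line.startswith("data: "):
            yield json.loads(line[len("data: "):])

def collect_answer(lines) -> str:
    # Join the incremental 'answer' chunks of a streamed chat reply.
    return "".join(
        ev.get("answer", "")
        for ev in iter_sse_events(lines)
        if ev.get("event") == "message"
    )
```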
Knowledge Base
Create a Knowledge Base
- Studio → Knowledge → Create Knowledge
- Upload documents (txt, markdown, pdf, docx, html, etc.)
- Choose an indexing mode:
- High Quality - high-quality indexing; requires an Embedding model
- Economy - economy mode with keyword retrieval
Retrieval Settings
- Vector Search - vector retrieval
- Full Text Search - full-text retrieval
- Hybrid Search - hybrid retrieval
Use in an Application
Add a Knowledge Retrieval node in the workflow and select the knowledge base.
Detailed Reference
Resource Links