Ollama
A skill for invoking local large models via Ollama. Supports text generation by interacting with an Ollama instance over its API. Use when: (1) you need to call a local or remote Ollama model; (2) you need to run LLM inference tasks; (3) you need a Python script to interact with a specific Ollama instance (e.g. qwen3.5:9b).
MIT-0 · Free to use, modify, and redistribute. No attribution required.
by 乔焰阳 (@ayflying)
Security Scan (OpenClaw)
Verdict: Benign (high confidence)
Purpose & Capability
The name and description describe an Ollama client, and the code implements exactly that: it issues POST requests to an Ollama /api/generate endpoint, supports setting the host and model via environment variables, and includes a connectivity test. The declared requirements (none) are plausible for an instruction-only skill that expects the host to be provided or the service to run locally.
Instruction Scope
SKILL.md instructs use of the provided Python scripts and references an .env.example for env-based configuration. The scripts only read OLLAMA_HOST and OLLAMA_MODEL (and optionally load a .env via python-dotenv if available). There is no code reading unrelated system files or exfiltrating data. Note: SKILL.md lists .env.example in the tree, but the provided file manifest does not include it — small documentation / packaging mismatch.
Install Mechanism
No install spec is present; the skill is instruction + scripts only. That minimizes install-time risk (nothing is downloaded or installed by the skill itself). The scripts rely on standard Python libraries and requests; dotenv is optional.
Credentials
The skill requests no required environment variables in metadata, but the code uses OLLAMA_HOST and OLLAMA_MODEL (with defaults). This is proportionate to an HTTP client for Ollama. One item to review: the default OLLAMA_HOST is a hardcoded IP (http://100.66.1.2:11434) rather than localhost or empty, which is unusual and should be validated before running in your environment.
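The environment handling described above can be sketched as follows. This is a minimal sketch mirroring the scan's findings, not the skill's actual code; the function names are illustrative, and the defaults are the ones reported by the scan:

```python
import os

# Defaults mirror those the scan reports; override via environment or .env.
DEFAULT_HOST = "http://100.66.1.2:11434"  # hardcoded default flagged above; verify before use
DEFAULT_MODEL = "qwen3.5:9b"

def load_optional_dotenv() -> None:
    """Load a .env file if python-dotenv is installed; silently skip otherwise."""
    try:
        from dotenv import load_dotenv
        load_dotenv()
    except ImportError:
        pass

def resolve_config() -> tuple[str, str]:
    """Read OLLAMA_HOST / OLLAMA_MODEL, falling back to the defaults."""
    host = os.environ.get("OLLAMA_HOST", DEFAULT_HOST).rstrip("/")
    model = os.environ.get("OLLAMA_MODEL", DEFAULT_MODEL)
    return host, model
```

Because the hardcoded host is unusual, setting OLLAMA_HOST explicitly (e.g. to http://localhost:11434) is the safer path.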
Persistence & Privilege
The skill does not request persistent presence (always:false) and does not modify agent/system configuration. It will only run when invoked and does not grant itself elevated platform privileges.
Assessment
This skill appears to be a straightforward Ollama HTTP client. Before installing or running: (1) inspect and, if necessary, change the default OLLAMA_HOST; the provided default uses an unusual IP (http://100.66.1.2:11434) that may not be what you expect; (2) make sure you run it against a trusted Ollama instance (local or remote), because the script will send your prompts to that host; (3) note that SKILL.md mentions a .env.example that wasn't included in the manifest, so create or verify your .env to supply OLLAMA_HOST/OLLAMA_MODEL as needed; (4) install only the Python packages you trust (requests, optionally python-dotenv); and (5) as with any third-party skill, review the scripts yourself if you plan to run them with sensitive input or in an environment with access to internal networks.
Current version: v0.1.0
SKILL.md
Ollama Skill
Calls Ollama models via the API. Supports customizing the host, model name, and prompt.
Core Configuration
- Environment variables: see .env.example
- Default host: http://100.66.1.2:11434
- Default model: qwen3.5:9b (recommended)
Usage
Invocation examples
# Basic invocation (default model)
python scripts/ollama_query.py "Briefly explain quantum mechanics"
# Invocation with a specific model
python scripts/ollama_query.py "Write a poem about spring" qwen2.5:7b
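Under the hood, a call like the ones above boils down to a single non-streaming POST to Ollama's /api/generate endpoint. A minimal sketch follows; the function names are illustrative and not necessarily those used in ollama_query.py, and it uses the stdlib urllib so it runs without extra dependencies (the skill itself uses requests):

```python
import json
import urllib.request

def build_payload(prompt: str, model: str) -> dict:
    """Assemble the non-streaming /api/generate request body."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "qwen3.5:9b",
             host: str = "http://localhost:11434") -> str:
    """Send the prompt to an Ollama instance and return the generated text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]
```

With stream set to False, the server returns one JSON object whose "response" field holds the full completion, which keeps the client to a few lines.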
Installation
Install with the npx skills tool:
npx skills add ayflying/ai-skills --skill ollama
File Structure
ollama/
├── SKILL.md              # skill documentation
├── scripts/
│   ├── ollama_query.py   # core model-query script
│   └── test_ollama.py    # connectivity test script
├── .env.example          # environment variable template
└── assets/               # assets directory
Notes
- Environment setup: make sure Ollama is installed locally and the service is running (ollama serve).
- Model download: pull the model before first use, e.g. ollama pull qwen3.5:9b.
- Python dependencies: the requests library is required (pip install requests).
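A connectivity check like the bundled test script can be approximated by querying the /api/tags endpoint, which lists the models installed on the server. This is a sketch only; test_ollama.py may differ, and the helper names here are assumptions:

```python
import json
import urllib.request

def tags_url(host: str) -> str:
    """Build the /api/tags URL, tolerating a trailing slash on the host."""
    return host.rstrip("/") + "/api/tags"

def check_ollama(host: str = "http://localhost:11434") -> bool:
    """Return True if the Ollama server answers /api/tags with valid JSON."""
    try:
        with urllib.request.urlopen(tags_url(host), timeout=5) as resp:
            json.loads(resp.read())  # valid JSON means the server is up
            return True
    except (OSError, ValueError):
        return False
```

Running this before ollama_query.py gives a quick yes/no on whether the configured host is reachable, without sending any prompt content.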
Files: 3 total