Ollama

v1.0.0

Run, tune, and troubleshoot local Ollama models with reliable API patterns, Modelfiles, embeddings, and hardware-aware deployment workflows.

by Iván (@ivangdavila)
MIT-0
Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description (local Ollama model management) align with the required binary (ollama) and optional tools (curl, jq). The required config paths (~/ollama/, ~/.ollama/) are consistent with storing durable local state for this purpose.
Instruction Scope
Runtime instructions focus on the local CLI and the local HTTP API (127.0.0.1:11434), Modelfile workflows, embeddings, and persistent notes under ~/ollama/. The skill explicitly instructs the agent to ask for user consent before creating memory files; even so, anyone installing it should confirm before allowing writes to the home-directory memory.
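The Modelfile workflows mentioned above use Ollama's standard Modelfile format. A minimal sketch for orientation; the base model name, parameter value, and system prompt here are illustrative placeholders, not part of this skill:

```
# Minimal Modelfile sketch (values are illustrative)
FROM llama3
PARAMETER temperature 0.7
SYSTEM You are a concise local assistant.
```

A file like this is built into a local model with `ollama create <name> -f Modelfile` and then run with `ollama run <name>`.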
Install Mechanism
Instruction-only skill with no install spec and no remote download/extract steps. Lowest-risk install posture (relies on existing 'ollama' binary and standard CLI tools).
Credentials
No environment variables or credentials requested. The only nontrivial access is to user config paths in the home directory (~/ollama/, ~/.ollama/) for durable state — this is proportional to the stated purpose but is a persistence/privacy consideration the user should accept explicitly.
Persistence & Privilege
always:false (not force-included). The skill can be invoked autonomously (platform default) but does not request elevated persistence or modify other skills. It documents guardrails around remote exposure and requests explicit approval for non-local binding or service-manager changes.
Assessment
This skill appears coherent for managing local Ollama models. Before installing or using it:
1. Confirm you have the official 'ollama' binary on the machine (do not run untrusted installers).
2. Review and explicitly approve any writes to ~/ollama/ (the skill stores durable operational notes there).
3. Keep port 11434 bound to localhost unless you explicitly approve and configure auth/firewalls.
4. Remember the skill may run local commands (ollama list/serve/run/etc.); only grant access on machines you control.
If you plan to use any cloud paths, verify the skill asks you first and inspect the exact endpoints before enabling them.
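The first and third checks above can be sketched as a quick shell pass. This assumes Ollama's defaults (port 11434 on localhost) and a standard PATH; adapt it to your machine:

```shell
#!/bin/sh
# Hedged pre-use checks matching the assessment above.
# Ollama's default API endpoint is http://127.0.0.1:11434.

# 1) Is the official 'ollama' binary on PATH? (Install it yourself;
#    never pipe untrusted installers into a shell.)
bin_status=$(command -v ollama >/dev/null 2>&1 && echo found || echo missing)
echo "ollama binary: $bin_status"

# 2) Does the API answer on localhost? A timeout usually means the
#    service is not running, or is bound somewhere else.
api_status=$(curl -sf --max-time 2 http://127.0.0.1:11434/api/version \
  >/dev/null 2>&1 && echo reachable || echo unreachable)
echo "local API: $api_status"
```

`/api/version` is a standard Ollama endpoint that needs no model loaded. If you ever bind the service to anything other than 127.0.0.1, put authentication and a firewall in front first, as the skill's own guardrails require.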

Like a lobster shell, security has layers — review code before you run it.

latest · vk972p6xn8cr6gp26cb1nn9799h82rc1v

License

MIT-0
Free to use, modify, and redistribute. No attribution required.

Runtime requirements

🦙 Clawdis
OS: Linux · macOS · Windows
Bins: ollama
Any bin: curl, jq
Config: ~/ollama/, ~/.ollama/
