Deepseek V4

v1.0.1

Use DeepSeek V4 (Flash & Pro) from the command line — one-shot Q&A, thinking mode, multi-turn chat. OpenAI-compatible API, no special CLI needed. Supports de...


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for jiajiaoy/deepseek-v4.

Prompt Preview: Install & Setup
Install the skill "Deepseek V4" (jiajiaoy/deepseek-v4) from ClawHub.
Skill page: https://clawhub.ai/jiajiaoy/deepseek-v4
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required env vars: DEEPSEEK_API_KEY
Required binaries: uv
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install deepseek-v4

ClawHub CLI

Package manager switcher

npx clawhub@latest install deepseek-v4
Security Scan
Capability signals
Requires sensitive credentials
These labels describe what authority the skill may exercise. They are separate from suspicious or malicious moderation verdicts.
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description request DEEPSEEK_API_KEY and the 'uv' runner, and the Python scripts implement an OpenAI-compatible client against https://api.deepseek.com/v1. This matches the claimed functionality (one-shot Q&A, thinking mode, multi-turn chat).
Instruction Scope
SKILL.md and scripts only instruct reading DEEPSEEK_API_KEY, invoking the DeepSeek API via the OpenAI-compatible client, and using 'uv run' for the scripts. There are no instructions to read unrelated files, other env vars, or send data to unexpected endpoints.
Install Mechanism
The install spec only installs the 'uv' binary via brew, which is appropriate for running the provided commands. The Python scripts depend on the 'openai' package, but the install spec doesn't list Python dependency installation; this is an operational omission rather than a security red flag.
Credentials
Only DEEPSEEK_API_KEY is required, which is proportional and expected for a client that calls the DeepSeek API. No unrelated credentials or broad filesystem/config paths are requested.
Persistence & Privilege
The skill sets always:false and makes no modifications to other skills or system-wide settings. It does not request elevated or persistent platform privileges.
Assessment
This skill appears to do exactly what it claims: a small Python wrapper that calls DeepSeek's OpenAI-compatible API. Before installing, confirm you trust the DeepSeek service and the 'uv' Homebrew package on your machine. Note the scripts require the Python 'openai' package (not installed by the brew step) and rely on DEEPSEEK_API_KEY — keep that key private. Also be aware 'thinking mode' streams internal reasoning (chain-of-thought), which can reveal intermediate deductions you may prefer not to log or share. If you need stronger assurance, verify the 'uv' formula source and inspect the installed Python package versions before use.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

Bins: uv
Env: DEEPSEEK_API_KEY

Install

Install uv (brew)
Bins: uv
brew install uv
Latest version hash: vk97134xb31mq4czf368mj11y5s85ppky
Downloads: 30 · Stars: 0 · Versions: 2 · Updated 3h ago
v1.0.1 · License: MIT-0

DeepSeek V4

Use DeepSeek V4 Flash and Pro directly from your terminal — one-shot questions, deep reasoning with thinking mode, and multi-turn chat. No special CLI required; uses the OpenAI-compatible API via a small Python script.

Setup

1. Get API key: https://platform.deepseek.com/api_keys

2. Set environment variable:

export DEEPSEEK_API_KEY=your_key_here
# Add to ~/.zshrc or ~/.bashrc to persist
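Since every command below depends on this variable, a small fail-fast check avoids a cryptic stack trace later. The helper below is a generic sketch, not part of the skill's own scripts:

```python
import os
import sys

def require_api_key() -> str:
    # Exit with a clear message if DEEPSEEK_API_KEY was never exported,
    # instead of failing deep inside an API call.
    key = os.environ.get("DEEPSEEK_API_KEY")
    if not key:
        sys.exit("DEEPSEEK_API_KEY is not set; see Setup above.")
    return key
```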

Models

Model       | ID                | Best for                            | Price per 1M (input/output)
V4 Flash ⚡ | deepseek-v4-flash | Q&A, writing, coding, summaries     | $0.014 / $0.028
V4 Pro 🚀   | deepseek-v4-pro   | Hard reasoning, math, deep analysis | $0.174 / $0.348

Both support 1M token context. Cache hits are 10× cheaper.

Legacy aliases (deepseek-chat → flash, deepseek-reasoner → pro) are deprecated as of 2026-07-24.
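The per-token prices above translate into a quick cost estimate. The sketch below copies the prices from the table and applies the 10× cache discount to cached input tokens; it is an illustration, not part of the skill's scripts:

```python
# Prices in USD per 1M tokens, from the table above.
PRICES = {
    "deepseek-v4-flash": {"input": 0.014, "output": 0.028},
    "deepseek-v4-pro":   {"input": 0.174, "output": 0.348},
}

def estimate_cost(model, input_tokens, output_tokens, cached_tokens=0):
    """Estimate request cost; cache hits are billed at 1/10 the input rate."""
    p = PRICES[model]
    uncached = input_tokens - cached_tokens
    cost = (
        uncached * p["input"]
        + cached_tokens * p["input"] / 10
        + output_tokens * p["output"]
    ) / 1_000_000
    return round(cost, 6)
```

For example, 1M uncached input tokens on Flash costs $0.014; fully cached, the same input costs $0.0014.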

Commands

One-shot question (Flash — fast & cheap)

uv run {baseDir}/scripts/ask.py "Explain the difference between V4 Flash and V4 Pro"

One-shot with Pro model

uv run {baseDir}/scripts/ask.py "Write a merge sort in Rust" --model pro

Thinking mode (Pro with visible reasoning trace)

uv run {baseDir}/scripts/ask.py "Prove that there are infinitely many primes" --think
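In DeepSeek's OpenAI-compatible responses, reasoning models return the trace in a `reasoning_content` field alongside the final `content`. Whether ask.py separates the two exactly this way is an assumption; the helper below is a generic sketch of that split:

```python
def split_reasoning(message: dict) -> tuple[str, str]:
    # Reasoner responses carry the chain-of-thought in `reasoning_content`
    # and the final answer in `content`; non-thinking models omit the former.
    return message.get("reasoning_content", ""), message.get("content", "")
```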

Multi-turn chat

uv run {baseDir}/scripts/chat.py --model flash
uv run {baseDir}/scripts/chat.py --model pro --think
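Multi-turn chat boils down to accumulating the conversation history and resending it on every turn. The loop below is a hypothetical sketch of that pattern, not the actual chat.py; the API call is injected as `send` so the loop itself stays testable offline:

```python
def chat_loop(send, get_input, system=None, max_turns=None):
    """Minimal multi-turn loop: history grows with each exchange.

    `send(messages) -> reply` performs the API call; `get_input()` returns
    the next user line, or None to end the session.
    """
    history = []
    if system:
        history.append({"role": "system", "content": system})
    turns = 0
    while max_turns is None or turns < max_turns:
        user = get_input()
        if user is None or user.strip() in {"exit", "quit"}:
            break
        history.append({"role": "user", "content": user})
        reply = send(history)
        history.append({"role": "assistant", "content": reply})
        turns += 1
    return history
```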

Show models & pricing

uv run {baseDir}/scripts/models.py

Model Selection Guide

Use Flash when:

  • Everyday Q&A and explanations
  • Writing, editing, translation
  • Code generation and review
  • Summarization and classification
  • Cost is a priority

Use Pro when:

  • Multi-step math or logic problems
  • Complex debugging or architecture decisions
  • Deep research and analysis
  • You want to see the reasoning process (--think)

Tips

  • Thinking mode (--think) streams the internal reasoning before the final answer — useful for hard problems and to verify correctness
  • System prompt: --system "You are a concise assistant" to set tone
  • No streaming: --no-stream for cleaner output in scripts
  • DeepSeek's API is OpenAI-compatible — any OpenAI SDK works with base_url="https://api.deepseek.com/v1"
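To illustrate the compatibility point above without pulling in the openai package, here is a dependency-free sketch that speaks the OpenAI chat-completions wire format directly against the base URL from this page (the model ID comes from the table above; this is not the skill's own ask.py):

```python
import json
import os
import urllib.request

BASE_URL = "https://api.deepseek.com/v1"

def build_request(prompt, model="deepseek-v4-flash", system=None):
    # OpenAI-compatible payload for POST {BASE_URL}/chat/completions.
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return {"model": model, "messages": messages}

def ask(prompt, model="deepseek-v4-flash"):
    body = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['DEEPSEEK_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Any OpenAI SDK would replace the urllib plumbing with a client configured with the same `base_url` and API key.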
