Mistral Mcp Openclaw

v0.1.0

Configure OpenClaw to use the community mistral-mcp stdio server for Mistral OCR, Codestral FIM, Voxtral audio, moderation, classification, files, batch, and...

1 star · 38 downloads · 0 current · 0 all-time

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt below, then paste it into OpenClaw to install swih/mistral-mcp-openclaw.

Prompt preview: Install & Setup
Install the skill "Mistral Mcp Openclaw" (swih/mistral-mcp-openclaw) from ClawHub.
Skill page: https://clawhub.ai/swih/mistral-mcp-openclaw
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Required env vars: MISTRAL_API_KEY
Required binaries: openclaw, node, npm
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install mistral-mcp-openclaw

ClawHub CLI


npx clawhub@latest install mistral-mcp-openclaw
Security Scan

Capability signals: requires sensitive credentials. These labels describe what authority the skill may exercise; they are separate from suspicious or malicious moderation verdicts.

  • VirusTotal: Benign
  • OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description state that this is an OpenClaw MCP adapter for Mistral. The declared requirements (openclaw, node, npm) and the single environment variable (MISTRAL_API_KEY) match that purpose: a stdio MCP wrapper needs a binary and the service API key. Nothing requested is unrelated to the Mistral/OpenClaw integration.
Instruction Scope
SKILL.md contains straightforward setup steps: install the mistral-mcp npm package, set MISTRAL_API_KEY in the environment, and register the stdio MCP with openclaw. The instructions do not request reading unrelated files, scanning other credentials, or sending data to unexpected endpoints. The doc also warns not to paste API keys into chat.
Install Mechanism
Installation is via an npm package (mistral-mcp), which is appropriate for a Node stdio adapter, but npm packages are a moderate-risk install source compared to packaged binaries from vetted repos. The SKILL.md recommends a global (-g) install, which modifies the system environment; review the package source and version before installing globally, or prefer a local/isolated install (or npx) if concerned.
Credentials
Only MISTRAL_API_KEY is required and it is declared as the primary credential. There are no unrelated secret env vars requested and the SKILL.md uses the key only to register the MCP server. This is proportionate to the stated functionality.
Persistence & Privilege
The skill is not forced-always, is user-invocable, and does not declare any system-wide configuration changes beyond installing an npm binary and registering an MCP entry in OpenClaw (which is appropriate for this integration). It does not request or modify other skills' configs.
Assessment
This skill is internally consistent with its stated purpose, but it relies on a community npm package. Before installing:

  1. Review the mistral-mcp GitHub repo and npm package (source, recent activity, issues, and version) to ensure it meets your security standards.
  2. Avoid pasting API keys into chat; use environment variables or a secret manager as recommended.
  3. Prefer installing the package locally or using npx in an isolated environment rather than a global -g install if you want to limit system-wide changes.
  4. Verify your Mistral account limits and billing to avoid unexpected charges when using OCR, transcription, or batch workloads.

Like a lobster shell, security has layers — review code before you run it.

Runtime requirements

🌊 Clawdis

  • Bins: openclaw, node, npm
  • Env: MISTRAL_API_KEY
  • Primary env: MISTRAL_API_KEY

Install

Node
Bins: mistral-mcp
npm i -g mistral-mcp
Latest: vk974ba1tvsb5j2dqdv7vcmqqyh85pdza
38 downloads · 1 star · 1 version · Updated 3h ago
v0.1.0
MIT-0

Mistral MCP for OpenClaw

Created by the maintainer of mistral-mcp. This is a community skill, not an official OpenClaw or Mistral integration.

Use this skill when you want OpenClaw to access Mistral capabilities beyond the built-in chat/model routing provider:

  • OCR for documents and images
  • Codestral fill-in-the-middle (FIM) code completion
  • Voxtral transcription and speech tools
  • Moderation and classification endpoints
  • Files and batch API workflows
  • Live model and voice resources

OpenClaw already includes a built-in Mistral provider for chat. This skill is for tool-level MCP access alongside that provider.

Requirements

  • Node.js 18+
  • OpenClaw CLI
  • MISTRAL_API_KEY in your environment
  • mistral-mcp installed from npm
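The Node.js 18+ requirement can be checked up front. The helper below is illustrative, not part of the skill; it parses the string that `node --version` prints:

```shell
# Parse a "v18.19.0"-style version string (as printed by `node --version`)
# and check it against the Node.js 18+ requirement.
node_meets_requirement() {
  ver="${1#v}"          # strip the leading "v"
  major="${ver%%.*}"    # keep everything before the first dot
  [ "$major" -ge 18 ]
}
```

Usage: `node_meets_requirement "$(node --version)" && echo "Node OK"`.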

Setup

Install the MCP server package globally:

npm install -g mistral-mcp

Set your Mistral API key in your shell environment:

export MISTRAL_API_KEY="sk-..."
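As a sketch (the helper name is ours, not the skill's), you can fail fast if the key is missing before registering the server, without ever printing the key itself:

```shell
# Report whether MISTRAL_API_KEY is present, without echoing its value.
check_mistral_key() {
  if [ -z "${MISTRAL_API_KEY:-}" ]; then
    echo "MISTRAL_API_KEY is not set" >&2
    return 1
  fi
  echo "MISTRAL_API_KEY is set (${#MISTRAL_API_KEY} characters)"
}
```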

Register the stdio MCP server in OpenClaw:

openclaw mcp set mistral '{"command":"mistral-mcp","env":{"MISTRAL_API_KEY":"${MISTRAL_API_KEY}"}}'
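The inline JSON in that command, expanded for readability (the field names are exactly those used above; the full schema is defined by OpenClaw, not by this skill):

```json
{
  "command": "mistral-mcp",
  "env": {
    "MISTRAL_API_KEY": "${MISTRAL_API_KEY}"
  }
}
```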

Check the saved definition:

openclaw mcp show mistral --json
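OpenClaw can only spawn the stdio server if the registered command resolves on PATH. A small illustrative check (the helper is ours, not part of the skill):

```shell
# Sanity-check that a stdio MCP command is resolvable before OpenClaw spawns it.
mcp_bin_ok() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 found at $(command -v "$1")"
  else
    echo "$1 not found on PATH; re-check the npm install" >&2
    return 1
  fi
}
```

Usage: `mcp_bin_ok mistral-mcp`.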

When to use it

Use this skill for workflows where the agent needs a Mistral-specific tool, not just a chat model:

  • Extract text from a PDF or image with OCR
  • Ask Codestral for FIM / inline code completion
  • Transcribe or generate audio with Voxtral
  • Run moderation or classification before taking an action
  • Submit larger async workloads through the batch API
  • Inspect live model and voice catalogs as MCP resources

Safety notes

  • Do not paste API keys into chat or commit them to source files. Prefer environment variables or your normal secret manager.
  • Review the mistral-mcp package before installing it if you operate in a sensitive workspace. Source: https://github.com/Swih/mistral-mcp.
  • Mistral has its own pricing and rate limits. Check the current Mistral plan and usage policies before running batch or large transcription workloads.
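One way to follow the first note, as a sketch: keep the key in a user-only file and export it from there, rather than typing it anywhere it can be logged. The file path and the placeholder key below are assumptions, not something the skill defines.

```shell
# Store the key once in a file only your user can read (umask 077 => mode 600),
# then export it from there in each shell. "sk-your-key-here" is a placeholder.
umask 077
mkdir -p "$HOME/.config/mistral"
printf '%s\n' "sk-your-key-here" > "$HOME/.config/mistral/api_key"
export MISTRAL_API_KEY="$(cat "$HOME/.config/mistral/api_key")"
```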
