Mistral

v1.0.0

Europe's AI Hope: A Model Company Founded by a 23-Year-Old Informatics Olympiad Gold Medalist


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for hanxueyuan/mistral.

Prompt Preview: Install & Setup
Install the skill "Mistral" (hanxueyuan/mistral) from ClawHub.
Skill page: https://clawhub.ai/hanxueyuan/mistral
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install mistral

ClawHub CLI


npx clawhub@latest install mistral
Security Scan
VirusTotal
Benign
OpenClaw
Benign
high confidence
Purpose & Capability
The skill's name and description match the SKILL.md content (an informational overview of Mistral AI). It does not request unrelated credentials, binaries, or config paths.
Instruction Scope
SKILL.md contains only descriptive content and a short 'read_when' list indicating when to consult it. It does not instruct the agent to run commands, read files, access environment variables, or send data to external endpoints.
Install Mechanism
No install spec and no code files are present; this is instruction-only and nothing will be written to disk or executed by an installer.
Credentials
The skill declares no required environment variables or credentials. There is no disproportionate request for secrets or unrelated service tokens.
Persistence & Privilege
The skill's `always` flag is false, and it does not request persistent or elevated privileges. Model invocation is allowed (platform default), but the skill itself has no autonomous access to secrets or system state.
Assessment
This skill is basically a static informational article about Mistral AI and appears coherent and low-risk. Before relying on it for decisions, verify facts from primary sources (Mistral's official blog, Hugging Face pages, or reputable news outlets) because the skill provides summary content but no provenance or links. If you plan to automate actions based on this content, remember the skill has no external verification steps and could become out-of-date.

Like a lobster shell, security has layers — review code before you run it.

Latest version: v1.0.0 (vk97ce1rqfrfw6kb7cq5qjsa31n85kt59)
44 downloads · 0 stars · 1 version · Updated 1d ago
License: MIT-0

Mistral AI: A 23-Year-Old Prodigy and a Model Trained in 3 Days

Positioning: Europe's leading large language model company, founded by former DeepMind and Meta FAIR researchers, focused on open-source AI models and efficient inference, and challenging the AI dominance of the US tech giants.


The Founder's Story

Arthur Mensch is no ordinary founder. He won a gold medal at the International Olympiad in Informatics at 23 and became one of DeepMind's youngest researchers at 24. In April 2023 he founded Mistral AI in Paris together with Guillaume Lample and Timothé Lacroix, two former Meta FAIR researchers.

A few months later, in September 2023, they released their first model: Mistral 7B. Despite having only 7 billion parameters, it outperformed Meta's own 13-billion-parameter Llama 2. More striking still, training reportedly took only 3 days and cost under $100,000.

The news set the AI world abuzz: it broke the industry consensus that big models require big budgets.


Rapid Iteration

  • 2023.09: Mistral 7B released
  • 2023.12: Mixtral 8x7B (mixture-of-experts model) and a €385M Series A (Europe's largest AI round at the time)
  • 2024.05: Codestral code model
  • 2024.07: Mistral Large 2
  • 2024.09: Pixtral multimodal model
  • 2025.02: Mistral Saba

How Does It Make Money?

Mistral's strategy is "open source for acquisition, paid tiers for monetization":

  • Free and open source: smaller models from 7B to 8x22B parameters, with over 50 million downloads on Hugging Face
  • Paid API: enterprise-grade API services and private deployments
  • Cloud partnerships: available on both Microsoft Azure and AWS
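As a sketch of what the paid API side looks like in practice, here is a minimal Python client for Mistral's chat completions endpoint (https://api.mistral.ai/v1/chat/completions). The model name `mistral-small-latest` and the exact response shape are assumptions based on the public API, not something this article specifies:

```python
import json
import os
import urllib.request

# Mistral's public chat completions endpoint (assumed from the API docs).
API_URL = "https://api.mistral.ai/v1/chat/completions"


def build_request(prompt, model="mistral-small-latest"):
    """Build the JSON payload for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(prompt, api_key):
    """POST the request and return the assistant's reply text."""
    data = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=data,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Assumed OpenAI-style response shape: choices[0].message.content
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    key = os.environ.get("MISTRAL_API_KEY")
    if key:
        print(chat("Say hello in French.", key))
    else:
        # No key set: just show the payload we would send.
        print(json.dumps(build_request("Say hello in French."), indent=2))
```

The network call only runs if a `MISTRAL_API_KEY` environment variable is present; otherwise the script prints the payload it would send.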

The MoE architecture makes inference significantly cheaper than comparable dense models; this is the core competitive advantage.
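The cost saving can be illustrated with a toy top-k router: with 8 experts and 2 active per token (the Mixtral 8x7B configuration), each token pays roughly 2/8 of a dense layer's expert FLOPs. This is an illustrative sketch of the routing idea, not Mistral's actual implementation:

```python
import math
import random

# Toy top-k mixture-of-experts routing; sizes mirror Mixtral 8x7B's
# "8 experts, 2 active per token", but everything here is illustrative.
NUM_EXPERTS, TOP_K = 8, 2


def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]


def moe_layer(token, gate_scores, experts):
    """Route one token: run only the top-k experts, mix by gate weight."""
    ranked = sorted(range(NUM_EXPERTS), key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:TOP_K]
    weights = softmax([gate_scores[i] for i in chosen])
    # Only TOP_K of NUM_EXPERTS experts execute, so each token costs
    # roughly TOP_K/NUM_EXPERTS of the dense-equivalent expert compute.
    return sum(w * experts[i](token) for w, i in zip(weights, chosen))


# Demo: experts are simple scalar functions standing in for FFN blocks.
experts = [lambda x, k=k: (k + 1) * x for k in range(NUM_EXPERTS)]
random.seed(0)
scores = [random.random() for _ in range(NUM_EXPERTS)]
print(moe_layer(1.0, scores, experts))
```

The router's gate scores would normally come from a small learned linear layer; here they are random numbers purely to exercise the top-k selection.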


Key Numbers

Valuation of roughly €6 billion, a €385M Series A, a team of about 200, and over 50 million downloads of Mistral 7B.
