Published on January 16, 2026 by Dominic Böttger · 4 min read
Mistral AI just dropped something exciting for developers: Vibe CLI and Devstral 2. If you’ve been looking for an open-source alternative to closed AI coding assistants, this release deserves your attention.
What is Devstral 2?
Devstral 2 is Mistral’s next-generation open-source coding model family, available in two sizes:
| Model | Parameters | License | SWE-bench Score |
|---|---|---|---|
| Devstral 2 | 123B | Modified MIT | 72.2% |
| Devstral Small 2 | 24B | Apache 2.0 | 68.0% |
Both models feature a massive 256K context window, which means they can understand and work with entire codebases, not just individual files.
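To put that number in perspective, here's a back-of-the-envelope estimate of how much source code fits in one window. The ~10 tokens-per-line figure is a rough heuristic of mine and varies by language and style:

```python
# Rough estimate: how many lines of code fit in a 256K-token context window?
# Heuristic assumption: ~10 tokens per line of code (varies by language).
CONTEXT_TOKENS = 256_000
TOKENS_PER_LINE = 10  # assumed average, not a measured figure

lines = CONTEXT_TOKENS // TOKENS_PER_LINE
print(f"~{lines:,} lines of code")  # → ~25,600 lines of code
```

That's on the order of a mid-sized repository in a single prompt, which is what makes whole-codebase reasoning plausible.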
The smaller model is particularly interesting: it supports image inputs for multimodal agents and can run on consumer hardware, including single-GPU systems. If you have a GeForce RTX card, you can run it locally.
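As a rough sanity check on the consumer-hardware claim, weight memory scales with parameter count and numeric precision. This is a heuristic sketch only; actual usage also depends on the runtime, KV cache, and quantization scheme:

```python
# Approximate weight memory for a 24B-parameter model at different precisions.
# Heuristic: 1 billion params at 1 byte each is roughly 1 GB of weights.
def vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Estimated weight memory in GB (weights only, no KV cache)."""
    return params_billions * bytes_per_param

print(f"FP16:  ~{vram_gb(24, 2):.0f} GB")   # ~48 GB: data-center territory
print(f"8-bit: ~{vram_gb(24, 1):.0f} GB")   # ~24 GB: fits a 24 GB RTX card
print(f"4-bit: ~{vram_gb(24, 0.5):.0f} GB") # ~12 GB: comfortable on consumer GPUs
```

So the single-GPU claim checks out arithmetically once you quantize.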
How Does It Compare?
Mistral claims Devstral 2 is 5-28x smaller than DeepSeek V3.2 and 8-41x smaller than Kimi K2, while achieving competitive results. In human evaluations, it scored a 42.8% win rate against DeepSeek V3.2 with only a 28.6% loss rate.
Perhaps more importantly for production use: it’s reportedly up to 7x more cost-efficient than Claude Sonnet for real-world tasks.
Introducing Vibe CLI
Vibe CLI is where things get practical. It’s an open-source command-line coding assistant (Apache 2.0 license) that brings Devstral’s capabilities directly to your terminal.
Installation
Getting started is a one-liner:
```bash
curl -LsSf https://mistral.ai/vibe/install.sh | bash
```
That’s it. You’re ready to go.
Key Features
Project-Aware Context: Vibe automatically scans your file structure and Git status. It understands your project layout without manual configuration.
Smart References: Use @ to autocomplete file paths, ! to run shell commands, and slash commands for configuration. It feels natural if you’ve used other modern CLI tools.
Multi-File Orchestration: This is where Devstral’s 256K context window shines. Vibe can reason about architecture-level changes across your entire codebase, not just the file you’re looking at.
IDE Integration: If you use Zed, there’s already a native extension available.
Practical Capabilities
- Explore codebases and orchestrate multi-file changes
- Track framework dependencies automatically
- Detect failures and retry with corrections
- Fine-tune for specific languages or enterprise codebases
Pricing
During the launch period, both models are free via API.
Post-launch pricing will be:
| Model | Input | Output |
|---|---|---|
| Devstral 2 | $0.40/M tokens | $2.00/M tokens |
| Devstral Small 2 | $0.10/M tokens | $0.30/M tokens |
These are competitive rates, especially for the smaller model.
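To get a feel for what those rates mean in practice, here's a small sketch that turns the table above into dollar estimates. The token counts in the example are made up for illustration:

```python
# Estimate an API bill from the post-launch per-million-token rates above.
RATES = {  # USD per million tokens: (input, output)
    "devstral-2": (0.40, 2.00),
    "devstral-small-2": (0.10, 0.30),
}

def cost_usd(model: str, input_tokens: int, output_tokens: int) -> float:
    """Total cost in USD for one workload at the listed rates."""
    rate_in, rate_out = RATES[model]
    return (input_tokens * rate_in + output_tokens * rate_out) / 1_000_000

# Hypothetical workload: 2M input tokens, 500K output tokens.
print(cost_usd("devstral-2", 2_000_000, 500_000))        # ≈ 1.80 USD
print(cost_usd("devstral-small-2", 2_000_000, 500_000))  # ≈ 0.35 USD
```

At these prices, even heavy agentic use of the small model stays well under a dollar per session.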
Deployment Options
Devstral 2 (123B): Requires a minimum of four H100-class GPUs for data center deployment. Available on build.nvidia.com.
Devstral Small 2 (24B): Far more accessible. It supports single-GPU operation on NVIDIA DGX Spark and GeForce RTX cards, and even CPU-only configurations. NVIDIA NIM support is coming soon.
For best results, Mistral recommends a temperature setting of 0.2.
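A minimal sketch of what a request honoring that recommendation might look like. The model identifier (`devstral-2`) and endpoint path are assumptions on my part; check Mistral's API documentation for the exact names:

```python
# Sketch of a chat-completions request body using the recommended temperature.
# The model id "devstral-2" is assumed, not confirmed against Mistral's docs.
import json

payload = {
    "model": "devstral-2",   # assumed model identifier
    "temperature": 0.2,      # Mistral's recommended setting for Devstral 2
    "messages": [
        {"role": "user", "content": "Refactor this function to be iterative."},
    ],
}

# This payload would be POSTed to the Mistral chat completions endpoint
# with an Authorization: Bearer <API key> header.
print(json.dumps(payload, indent=2))
```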
My Take
The combination of permissive licensing (Apache 2.0 for the small model), competitive benchmarks, and practical tooling makes this release significant. Vibe CLI isn't trying to be a fancy IDE plugin; it's a focused terminal tool that does one thing well.
The 24B model running on consumer hardware is particularly exciting for local development. No API calls, no latency, no privacy concerns about your code leaving your machine.
That said, for my personal workflow I'm missing two features that have become essential: skills and agents. Tools like Claude Code allow you to define reusable skills (predefined prompts and workflows) and spawn specialized agents for specific tasks. This extensibility makes a huge difference when you're working on complex projects with recurring patterns. Vibe CLI feels capable but somewhat monolithic in comparison: you get the model, but not the scaffolding to customize and extend its behavior.
Hopefully these features are on the roadmap. The foundation is solid.
If you’re already comfortable with terminal-based workflows, give Vibe a try:
```bash
curl -LsSf https://mistral.ai/vibe/install.sh | bash
```
The barrier to entry is low, and the models are free during launch. Worth experimenting with on your next project.