Performance of Codestral-22B vs other models by end of 2025
Remains superior • 33%
Equal performance • 33%
Outperformed by newer models • 33%
Resolution source: Independent tech reviews and benchmarking reports
Mistral AI Launches Codestral-22B, a New Code Model Fluent in 80 Programming Languages with 32K Context Length
May 29, 2024, 02:22 PM
Mistral AI has launched its first code-focused model, Codestral-22B, trained on more than 80 programming languages and featuring a 32K-token context length. The open-weight model outperforms previous code models, including far larger ones, on code-generation benchmarks. It is available through Mistral's API platform, La Plateforme, can be used for free on Le Chat, and is integrated with VS Code. Alongside the model, Mistral AI introduced the Mistral AI Non-Production License (MNPL), which permits use of the weights for research and non-commercial purposes. The model is designed for code generation tasks and aims to balance openness with business growth.
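As a rough illustration of the API access mentioned above, the sketch below sends a small code-generation request to Mistral's chat-completions endpoint using a Codestral model. The endpoint URL, the "codestral-latest" model alias, and the MISTRAL_API_KEY environment variable are assumptions based on Mistral's public API conventions, not details from this story.

# Minimal sketch: asking Codestral for a code completion via Mistral's HTTP API.
# Assumptions (not from the story): the https://api.mistral.ai/v1/chat/completions
# endpoint, the "codestral-latest" model alias, and an API key in MISTRAL_API_KEY.
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"
API_KEY = os.environ["MISTRAL_API_KEY"]  # assumed environment variable

payload = {
    "model": "codestral-latest",  # assumed alias for Codestral-22B
    "messages": [
        {
            "role": "user",
            "content": "Write a Python function that parses an ISO-8601 date string.",
        }
    ],
    "temperature": 0.2,  # low temperature for more deterministic code output
    "max_tokens": 512,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()

# The generated code is in the first choice's message content.
print(response.json()["choices"][0]["message"]["content"])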
Related prediction options:

LLaMA 3 70B • 33%
DeepSeek 33B • 33%
A new entrant • 34%

Performance • 25%
Context length • 25%
Language support • 25%
Licensing model • 25%

Research and academic projects • 33%
Open-source software development • 33%
Prototype and experimental coding • 34%

1st place • 33%
2nd place • 33%
3rd place • 34%

Tech • 25%
Finance • 25%
Healthcare • 25%
Other • 25%