DeepNewz
Mistral AI Launches Codestral-22B Code Model with 32K Context Length, Outperforms Larger Models
May 29, 2024, 02:56 PM
Mistral AI has launched its first code model, Codestral-22B, designed for code generation tasks. The 22B-parameter dense model is trained on more than 80 programming languages and has a 32K-token context length. Codestral-22B outperforms several larger models, including LLaMA 3 70B and DeepSeek Coder 33B, on benchmarks such as RepoBench and HumanEval. The open-weight model is available as an instruct model on Mistral's API platform and can be tried for free on Le Chat. Mistral has also introduced a new Mistral AI Non-Production License (MNPL) that allows developers to use the model for research and other non-commercial purposes. Codestral-22B supports fill-in-the-middle completion and is available in VS Code through integrations such as Continue.
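The fill-in-the-middle (FIM) capability mentioned above means the model generates the code that belongs between a given prefix and suffix, which is what powers in-editor completion. Below is a minimal sketch of what a FIM request might look like; the endpoint URL, field names (`prompt`, `suffix`), and model identifier are assumptions based on Mistral's launch materials and should be checked against the current API reference:

```python
import json
import os
import urllib.request

# Assumed dedicated Codestral endpoint; verify against Mistral's API docs.
CODESTRAL_FIM_URL = "https://codestral.mistral.ai/v1/fim/completions"


def build_fim_payload(prefix: str, suffix: str,
                      model: str = "codestral-latest") -> dict:
    """Build a fill-in-the-middle request: the model is asked to generate
    the code that fits between `prefix` and `suffix`."""
    return {
        "model": model,
        "prompt": prefix,    # code before the gap
        "suffix": suffix,    # code after the gap
        "max_tokens": 64,
    }


def request_fim_completion(payload: dict, api_key: str) -> dict:
    """POST the payload to the FIM endpoint (requires a valid API key)."""
    req = urllib.request.Request(
        CODESTRAL_FIM_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    payload = build_fim_payload(
        "def add(a, b):\n    return ",
        "\n\nprint(add(1, 2))",
    )
    key = os.environ.get("MISTRAL_API_KEY")
    if key:
        print(request_fim_completion(payload, key))
    else:
        # No key set: just show the request shape.
        print(json.dumps(payload, indent=2))
```

The network call is kept separate from payload construction so the request shape can be inspected (or unit-tested) without an API key.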