Mistral AI Launches Codestral-22B Code Model with 32K Context Length, Outperforms Larger Models
May 29, 2024, 02:56 PM
Mistral AI has launched its first-ever code model, Codestral-22B, designed for code generation tasks. The 22B dense model is trained on more than 80 programming languages and has a 32K context length. Codestral-22B outperforms many larger models, including LLaMA 3 70B and DeepSeek Coder 33B, on benchmarks such as RepoBench and HumanEval. The open-weight model is available as an instruct version on Mistral's API platform and can be tried for free on Le Chat. Additionally, Mistral has introduced a new Mistral AI Non-Production License (MNPL) that allows developers to use the model for non-commercial purposes and research. Codestral-22B is also integrated with VS Code, has fill-in-the-middle capabilities, and is available on Continue.
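The fill-in-the-middle capability mentioned above lets the model complete code between a prefix and a suffix, which is what powers editor integrations like VS Code and Continue. A minimal sketch of calling Codestral's FIM endpoint is below; the endpoint URL and field names follow Mistral's public API documentation, but the model name `codestral-latest`, the response shape, and the helper names are assumptions, not confirmed by the article.

```python
# Sketch: querying Codestral's fill-in-the-middle (FIM) endpoint.
# Assumptions: model name "codestral-latest" and the chat-completion-style
# response shape are illustrative, taken from Mistral's public docs, not
# from the article above.
import json
import urllib.request


def build_fim_request(prompt: str, suffix: str,
                      model: str = "codestral-latest") -> dict:
    """Build a FIM payload: the model fills in the code between
    `prompt` (text before the cursor) and `suffix` (text after it)."""
    return {
        "model": model,
        "prompt": prompt,
        "suffix": suffix,
        "max_tokens": 64,
    }


def complete(payload: dict, api_key: str) -> str:
    """POST the payload to the Codestral FIM endpoint and return the
    completion text (response shape assumed to mirror chat completions)."""
    req = urllib.request.Request(
        "https://codestral.mistral.ai/v1/fim/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Build (but don't send) a request asking the model to fill in a function body.
payload = build_fim_request(prompt="def fib(n):\n", suffix="\nprint(fib(10))")
print(payload["model"])
```

In an editor integration, the prefix and suffix come from the text surrounding the cursor, so the model sees both sides of the insertion point rather than only what precedes it.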