IBM Launches Granite Code AI Models, 3B-34B Parameters, Open-Sources for Developers
May 7, 2024, 11:55 PM
IBM has released a new family of open-source code models named Granite Code, available in sizes ranging from 3 billion to 34 billion parameters. The models were trained on 116 programming languages using 3 to 4.5 trillion tokens, with pre-training on The Stack after filtering out low-quality code. The family, which includes both base and instruction-tuned versions, is licensed under Apache 2.0 and designed to support code-generation tasks for developers. IBM's Granite Code models, particularly the 8 billion parameter version, have outperformed other open code LLMs such as CodeGemma and Mistral on benchmarks.
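Since the models are open-sourced under Apache 2.0, they can be loaded with standard Hugging Face tooling. A minimal sketch follows; the model ID is an assumption based on IBM's naming convention (check the ibm-granite organization on Hugging Face for the exact identifiers), and the import is deferred so the sketch reads without transformers installed.

```python
# Sketch: generating code with an IBM Granite Code model via the
# Hugging Face transformers library. The MODEL_ID below is an assumed
# name; verify it against the ibm-granite org on the Hugging Face Hub.
MODEL_ID = "ibm-granite/granite-8b-code-instruct"  # assumed model ID


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Run one greedy generation pass with the Granite Code model."""
    # Imported inside the function so defining this sketch does not
    # require transformers (the model download itself is several GB).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Usage (downloads the model on first call):
# print(generate("Write a Python function that reverses a string."))
```

The instruction-tuned variants are the natural fit for prompt-style use like this; the base variants are intended for fine-tuning or completion-style tasks.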
Markets
Yes • 50%
No • 50%
Resolution source: Tech industry benchmarking reports and official product announcements

Yes • 50%
No • 50%
Resolution source: GitHub usage statistics and developer surveys

Yes • 50%
No • 50%
Resolution source: Official IBM announcements and tech news coverage

Entertainment • 20%
Finance • 20%
Healthcare • 20%
Automotive • 20%
Education • 20%
Resolution source: Industry reports, developer surveys, and usage data

3 billion parameters • 25%
8 billion parameters • 25%
16 billion parameters • 25%
34 billion parameters • 25%
Resolution source: Developer usage data, tech blogs, and GitHub stars

Java • 20%
Go • 20%
JavaScript • 20%
C# • 20%
Python • 20%
Resolution source: Programming language usage statistics on GitHub and developer surveys