IBM Launches Granite Code AI Models, 3B-34B Parameters, Open-Sources for Developers
May 7, 2024, 11:55 PM
IBM has released a new family of open-source code models named Granite Code, available in sizes ranging from 3 billion to 34 billion parameters. The models are trained on 116 programming languages using 3 to 4.5 trillion tokens, pre-trained on data from The Stack with low-quality code filtered out. The family, which includes both base and instruction-tuned versions, is licensed under Apache 2.0 and is designed to support code generation tasks for developers. IBM's Granite Code models, particularly the 8 billion parameter version, have outperformed other open LLMs such as CodeGemma and Mistral on benchmarks.
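Because the models are released under Apache 2.0, developers can load them with standard open-source tooling. Below is a minimal sketch using the Hugging Face transformers library; the model ID "ibm-granite/granite-3b-code-base" is an assumption based on IBM's naming and should be checked against the ibm-granite organization on Hugging Face.

```python
# Minimal sketch: generating code with a Granite Code model via transformers.
# The repo name below is an assumed example, not confirmed by the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-base"  # assumed model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```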