Which coding language will benefit the most from Arctic-SnowCoder-1.3B's capabilities by the end of 2024?
Python • 25%
JavaScript • 25%
Java • 25%
Other • 25%
Official reports and benchmarks from Snowflake AI and other tech analysts
Snowflake AI's Arctic-SnowCoder-1.3B Sets SOTA with 36% Higher Performance in Code Models
Sep 6, 2024, 05:05 PM
Snowflake AI Research has introduced Arctic-SnowCoder-1.3B, a new 1.3-billion-parameter model that sets the state of the art (SOTA) among small language models for code. The model is trained in three phases: general pretraining on 500 billion tokens of raw code data, followed by continued pretraining on high-quality data, and finally fine-tuning on domain-specific data. Despite using only 555 billion tokens in total, Arctic-SnowCoder-1.3B outperforms comparable models trained on over 1 trillion tokens by 36% on code generation tasks. The research was conducted by Snowflake AI Research in collaboration with the University of Illinois at Urbana-Champaign, with contributions from Y Wei, H Han, and R Samdani.
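The three-phase schedule described above lends itself to a short sketch. The snippet below only lays out the phase order and the overall token budgets reported in the story; the Phase class, the train_on_tokens placeholder, and the split of the remaining 55 billion tokens between the two later phases are illustrative assumptions, not Snowflake's actual pipeline.

    # Minimal sketch of a three-phase pretraining schedule (assumptions noted above).
    from dataclasses import dataclass

    @dataclass
    class Phase:
        name: str
        data_source: str   # description of the corpus used in this phase
        token_budget: int  # tokens consumed in this phase

    # Phase 1 budget (500B) and the 555B total follow the article; the
    # 50B/5B split of the later phases is an assumption for illustration.
    PHASES = [
        Phase("general_pretraining", "raw code data", 500_000_000_000),
        Phase("continued_pretraining", "high-quality code data", 50_000_000_000),
        Phase("domain_specific_training", "domain-specific data", 5_000_000_000),
    ]

    def train_on_tokens(phase: Phase) -> None:
        # Placeholder for the actual training loop (batching, optimizer steps, checkpoints).
        print(f"{phase.name}: {phase.token_budget:,} tokens of {phase.data_source}")

    def run_schedule(phases: list[Phase]) -> None:
        total = 0
        for phase in phases:
            train_on_tokens(phase)
            total += phase.token_budget
        print(f"total training tokens: {total:,}")  # 555,000,000,000 with the budgets above

    if __name__ == "__main__":
        run_schedule(PHASES)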