What will be the parameter size of Nous Research's next announced model by end of 2025?
Less than 20B • 25%
20B to 30B • 25%
31B to 50B • 25%
More than 50B • 25%
Resolution source: Nous Research official announcements or press releases
Nous Research Pre-Trains 15B Model with DeMo and Heterogeneous Hardware
Dec 2, 2024, 05:03 PM
Nous Research has announced the successful pre-training of a 15-billion-parameter language model using distributed training over the internet. The model, trained with Nous DisTrO (now DeMo), used heterogeneous hardware contributed by partners including Oracle, LambdaAPI, Northern Data Group, Crusoe Cloud, and the Andromeda Cluster. The training achieved a loss curve and convergence rate that meet or exceed those of centralized training methods, demonstrating the viability of decentralized training at scale.
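For context on the approach described above, here is a minimal, illustrative sketch of bandwidth-efficient data-parallel training: each worker computes a gradient on its own local data shard and shares only the k largest-magnitude entries, carrying the unsent remainder forward (top-k compression with error feedback). This is a generic stand-in, not Nous Research's actual DeMo/DisTrO algorithm; every name, shape, and hyperparameter below is an assumption chosen for illustration.

```python
import numpy as np

# Toy sketch: workers train a shared least-squares model on local data and
# communicate only sparse, top-k gradient summaries instead of full gradients.
# Illustrative only -- not the DeMo / DisTrO algorithm itself.

rng = np.random.default_rng(0)
dim, n_workers, k = 200, 4, 20           # k << dim: only 10% of entries "sent"
true_w = rng.normal(size=dim)            # ground-truth weights to recover
w = np.zeros(dim)                        # shared model replica

def local_gradient(w, n=256):
    """Least-squares gradient on a fresh local data shard."""
    X = rng.normal(size=(n, dim))
    y = X @ true_w
    return X.T @ (X @ w - y) / n

def top_k(g, k):
    """Keep only the k largest-magnitude entries (the part 'transmitted')."""
    idx = np.argpartition(np.abs(g), -k)[-k:]
    sparse = np.zeros_like(g)
    sparse[idx] = g[idx]
    return sparse

lr = 0.1
residual = np.zeros((n_workers, dim))    # per-worker error-feedback buffers

for step in range(300):
    updates = []
    for i in range(n_workers):
        g = local_gradient(w) + residual[i]   # re-add what was never sent
        sparse = top_k(g, k)
        residual[i] = g - sparse              # keep the unsent remainder locally
        updates.append(sparse)
    w -= lr * np.mean(updates, axis=0)        # aggregate the compressed updates
    if step % 100 == 0:
        print(f"step {step:3d}  mse {np.mean((w - true_w) ** 2):.4f}")

print("final mse:", float(np.mean((w - true_w) ** 2)))
```

Error feedback (accumulating the unsent residual locally) is a standard companion to top-k compression; without it, sparsified updates can stall. The actual DeMo optimizer communicates far less than naive gradient exchange in a different way, so this sketch only conveys the general bandwidth-reduction idea.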