Nous Research Pre-Trains 15B Model with DeMo and Heterogeneous Hardware
Dec 2, 2024, 05:03 PM
Nous Research has announced the successful pre-training of a 15-billion-parameter language model using distributed training over the internet. The model, trained with Nous DisTrO (now DeMo), used heterogeneous hardware contributed by partners including Oracle, LambdaAPI, Northern Data Group, Crusoe Cloud, and the Andromeda Cluster. The training achieved a loss curve and convergence rate that meet or exceed those of centralized training methods, demonstrating the viability of decentralized training at scale.
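To make the idea of low-communication decentralized training concrete, below is a minimal toy simulation of data-parallel training in which each worker keeps its optimizer state local and transmits only a small, sparse slice of its update each step. The compression scheme (top-k with error feedback), the hyperparameters, and all function names here are illustrative assumptions, not Nous Research's actual DeMo algorithm.

```python
# Illustrative sketch only: toy low-communication data-parallel training.
# Assumption: top-k sparsification of a local momentum buffer stands in for
# whatever compression DeMo actually uses; nothing here is Nous's real code.
import numpy as np

rng = np.random.default_rng(0)
dim, workers, steps, lr, beta, k = 1_000, 4, 200, 0.05, 0.9, 50

true_w = rng.normal(size=dim)                       # target weights of a toy regression task
w = np.zeros(dim)                                   # replicated model parameters
momenta = [np.zeros(dim) for _ in range(workers)]   # per-worker momentum, never synced in full

def local_gradient(w, n=256):
    """Gradient of mean squared error on a fresh mini-batch (simulates a worker's data shard)."""
    X = rng.normal(size=(n, dim))
    err = X @ (w - true_w)
    return X.T @ err / n

def top_k(v, k):
    """Keep only the k largest-magnitude entries; the rest is not communicated this step."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

for step in range(steps):
    shared_update = np.zeros(dim)
    for i in range(workers):
        g = local_gradient(w)
        momenta[i] = beta * momenta[i] + g          # optimizer state stays on the worker
        sent = top_k(momenta[i], k)                 # only a sparse slice crosses the network
        momenta[i] -= sent                          # error feedback: keep what wasn't transmitted
        shared_update += sent / workers
    w -= lr * shared_update                         # every replica applies the same aggregate
    if step % 50 == 0:
        print(f"step {step:3d}  loss {np.mean((w - true_w) ** 2):.4f}")
```

The point of the sketch is the communication pattern: only the sparse `sent` vectors leave each worker, so per-step bandwidth scales with k rather than with the full parameter count, which is what makes training over the internet on heterogeneous hardware plausible.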
Markets
No • 50% | Yes • 50%
Resolution source: Official announcements from Nous Research or partner companies

Yes • 50% | No • 50%
Resolution source: Nous Research official announcements or press releases

No • 50% | Yes • 50%
Resolution source: Peer-reviewed AI benchmark studies or reputable AI research publications

Less than 20B • 25% | 20B to 30B • 25% | 31B to 50B • 25% | More than 50B • 25%
Resolution source: Nous Research official announcements or press releases

Google • 25% | Oracle • 25% | Amazon • 25% | Microsoft • 25%
Resolution source: Official announcements from Nous Research or the tech company

Healthcare • 25% | Finance • 25% | Education • 25% | Other • 25%
Resolution source: Nous Research's official reports or industry analysis publications