Will a new collaboration between Nous Research and training partners be announced by March 31, 2025?
Yes • 50%
No • 50%
Official announcements from Nous Research or partner companies
Nous Research Pre-Trains 15B Model with DeMo and Heterogeneous Hardware
Dec 2, 2024, 05:03 PM
Nous Research has announced the successful pre-training of a 15-billion-parameter language model via distributed training over the internet. The model was trained with Nous DisTrO (now DeMo) on heterogeneous hardware contributed by partners including Oracle, LambdaAPI, Northern Data Group, Crusoe Cloud, and the Andromeda Cluster. The training achieved a loss curve and convergence rate that match or exceed those of centralized training methods, demonstrating the viability of decentralized training at scale.
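The summary credits DeMo (Decoupled Momentum) with making internet-scale training viable by shrinking the communication needed between workers. A minimal single-worker sketch of the general idea — accumulate momentum locally and transmit only its fastest-moving components, keeping the residual local — with hypothetical names; this is an illustration of the concept, not Nous Research's actual implementation:

```python
def demo_step(params, grad, momentum, lr=0.01, beta=0.9, k=2):
    """One simplified DeMo-style step for a single worker (hypothetical sketch).

    Accumulate the gradient into a local momentum buffer, select only the
    k largest-magnitude momentum components for communication, and subtract
    the transmitted part so slower components stay local until they have
    accumulated enough to be worth sending.
    """
    momentum = [beta * m + g for m, g in zip(momentum, grad)]
    # Indices of the k fastest-moving (largest-magnitude) components.
    top = sorted(range(len(momentum)), key=lambda i: abs(momentum[i]))[-k:]
    transmitted = [momentum[i] if i in top else 0.0 for i in range(len(momentum))]
    # Keep the untransmitted residual in the local momentum buffer.
    momentum = [m - t for m, t in zip(momentum, transmitted)]
    # In a multi-worker run, `transmitted` would be all-reduced across
    # workers here; with one worker it is applied directly.
    params = [p - lr * t for p, t in zip(params, transmitted)]
    return params, momentum

# Toy usage: one 4-element parameter vector, one synthetic gradient.
p, m = demo_step([0.0] * 4, [0.1, 2.0, -3.0, 0.05], [0.0] * 4)
```

Only the two largest momentum components are applied (and would be communicated); the two small ones remain in the local buffer, which is what cuts inter-node bandwidth relative to synchronizing full gradients every step.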