Will Nous Research's 15B model be available for licensing by June 30, 2025?
Yes • 50%
No • 50%
Resolution source: Nous Research official announcements or press releases
Nous Research Pre-Trains 15B Model with DeMo and Heterogeneous Hardware
Dec 2, 2024, 05:03 PM
Nous Research has announced the successful pre-training of a 15-billion-parameter language model using distributed training over the internet. The model, trained with Nous DisTrO (now DeMo), used heterogeneous hardware contributed by partners including Oracle, LambdaAPI, Northern Data Group, Crusoe Cloud, and the Andromeda Cluster. The training achieved a loss curve and convergence rate that meet or exceed those of centralized training methods, demonstrating the viability of decentralized training at scale.