Will Nous Research's 15B model outperform GPT-4 in benchmarks by end of 2024?
Yes • 50%
No • 50%
Resolution source: Peer-reviewed AI benchmark studies or reputable AI research publications
Nous Research Pre-Trains 15B Model with DeMo and Heterogeneous Hardware
Dec 2, 2024, 05:03 PM
Nous Research has announced the successful pre-training of a 15-billion-parameter language model using distributed training over the internet. The model, trained with Nous DisTrO (now DeMo), used heterogeneous hardware contributed by partners including Oracle, LambdaAPI, Northern Data Group, Crusoe Cloud, and the Andromeda Cluster. The training achieved a loss curve and convergence rate that meet or exceed those of centralized training, demonstrating the viability of decentralized training at scale.
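The announcement gives no implementation details, but the core idea behind a decoupled-momentum optimizer such as DeMo can be sketched: each worker keeps its full optimizer state locally and exchanges only a small, compressed slice of the update each step, which is what makes training over ordinary internet links feasible. The PyTorch sketch below is a hypothetical illustration only; the function name, the top-k compression scheme, and all hyperparameters are assumptions, not Nous Research's actual code (the published DeMo method is described as using a frequency-domain decomposition of the momentum rather than naive top-k selection).

```python
# Hypothetical sketch of a decoupled-momentum-style distributed step.
# Each worker keeps its full momentum buffer locally and synchronizes
# only a sparse "fast" component across the cluster. All names and the
# top-k compression here are illustrative assumptions.
import torch
import torch.distributed as dist

def demo_style_step(param, grad, momentum, lr=3e-4, beta=0.9, k=64):
    # Accumulate the gradient into the worker-local momentum buffer.
    momentum.mul_(beta).add_(grad)

    # Pick the k largest-magnitude momentum entries as the "fast"
    # component cheap enough to exchange over slow internet links.
    flat = momentum.view(-1)
    idx = flat.abs().topk(min(k, flat.numel())).indices
    fast = torch.zeros_like(flat)
    fast[idx] = flat[idx]

    # Remove the transmitted component from the local momentum, so the
    # remaining state is allowed to diverge across workers (the
    # "decoupled" part).
    flat.sub_(fast)

    # Average only the sparse fast component across all workers.
    dist.all_reduce(fast, op=dist.ReduceOp.AVG)

    # Apply the synchronized update to the parameter.
    param.data.add_(fast.view_as(param), alpha=-lr)
```

The point of the design is bandwidth: only k values per tensor cross the network each step, while the bulk of the optimizer state never leaves the worker, in contrast to standard data parallelism, which all-reduces every gradient element.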