Will OpenDiLoCo achieve more than 95% compute utilization in global AI model training by end of 2024?
Yes • 50%
No • 50%
Resolution source: Official reports or announcements from PrimeIntellect
PrimeIntellect Open-Sources OpenDiLoCo, Scaling Up DeepMind’s DiLoCo for Global AI Training
Jul 11, 2024, 04:59 PM
DeepMind's Distributed Low-Communication (DiLoCo) method has been open-sourced and scaled up by PrimeIntellect. The new framework, OpenDiLoCo, integrates with FSDP and enables globally distributed AI model training with 90-95% compute utilization across three countries. This development marks a significant step towards decentralized training for AI models, making it more accessible and scalable. PrimeIntellect's work replicates and extends DeepMind's research, tripling the size of the original experiments. Further work is still needed, as standard distributed data parallel (DDP) training outperforms the approach in some settings.
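For context, DiLoCo is a two-level optimization scheme: each worker trains locally for many steps with no communication, and only the resulting parameter deltas ("pseudo-gradients") are periodically averaged and applied by an outer optimizer. That infrequent synchronization is what makes high compute utilization across countries plausible. Below is a minimal single-process PyTorch sketch of one outer round, assuming a hypothetical worker interface (worker.next_batch()) and a model whose forward pass returns a scalar loss; hyperparameters follow the DiLoCo paper, and none of this is PrimeIntellect's actual API.

import copy
import torch

def diloco_round(model, outer_opt, workers, inner_steps=500):
    # Snapshot the globally agreed-upon weights at the start of the round.
    global_params = [p.detach().clone() for p in model.parameters()]

    # Inner phase: each worker runs many local AdamW steps with zero
    # communication. With inner_steps=500, weights are exchanged ~500x
    # less often than in standard data-parallel training.
    local_params = []
    for worker in workers:
        replica = copy.deepcopy(model)
        inner_opt = torch.optim.AdamW(replica.parameters(), lr=4e-4)
        for _ in range(inner_steps):
            loss = replica(worker.next_batch())  # assumed: forward() returns loss
            loss.backward()
            inner_opt.step()
            inner_opt.zero_grad()
        local_params.append([p.detach() for p in replica.parameters()])

    # Outer phase: the only synchronization point in the round. The averaged
    # parameter delta is treated as a "pseudo-gradient" and applied by the
    # outer optimizer (Nesterov SGD in the DiLoCo paper).
    with torch.no_grad():
        for i, p in enumerate(model.parameters()):
            p.copy_(global_params[i])
            p.grad = torch.stack(
                [global_params[i] - lp[i] for lp in local_params]
            ).mean(dim=0)
    outer_opt.step()
    outer_opt.zero_grad()

# Construct the outer optimizer once, so Nesterov momentum persists
# across rounds (lr=0.7, momentum=0.9 as in the DiLoCo paper):
#   outer_opt = torch.optim.SGD(model.parameters(), lr=0.7,
#                               momentum=0.9, nesterov=True)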