Browse all stories on DeepNewz
Which country will contribute the most compute resources to OpenDiLoCo by end of 2024?
USA • 33%
China • 33%
Germany • 33%
Other • 1%
Official reports or announcements from PrimeIntellect
PrimeIntellect Open-Sources OpenDiLoCo, Scaling Up DeepMind’s DiLoCo for Global AI Training
Jul 11, 2024, 04:59 PM
DeepMind's Distributed Low-Communication (DiLoCo) training method has been open-sourced and scaled up by PrimeIntellect. The new framework, OpenDiLoCo, integrates with FSDP and enables globally distributed AI model training at 90-95% compute utilization across workers in three countries. This marks a significant step toward decentralized training, making large-scale model training more accessible and scalable. PrimeIntellect's work replicates and extends DeepMind's research, tripling the size of the original experiments. Further work is needed, as conventional distributed data parallel (DDP) training still outperforms the approach in some settings.
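The core idea DiLoCo exploits is a two-level optimization schedule: each worker takes many local optimizer steps between synchronizations, and only an averaged "pseudo-gradient" is communicated once per outer round. A minimal toy sketch of that schedule, assuming a one-parameter least-squares objective and illustrative hyperparameter names (this is not PrimeIntellect's actual API):

```python
# Toy sketch of DiLoCo-style low-communication training: 4 simulated
# workers each take many local SGD steps between syncs, and only an
# averaged "pseudo-gradient" is exchanged, once per outer round.
# All names and hyperparameters here are illustrative.
import random

def local_sgd(w, shard, steps, lr):
    """Inner loop: one worker runs `steps` SGD steps on its own data."""
    for _ in range(steps):
        x, y = random.choice(shard)
        grad = 2 * (w * x - y) * x  # gradient of (w*x - y)^2
        w -= lr * grad
    return w

def diloco_round(w, shards, inner_steps, inner_lr, outer_lr, momentum, v):
    """One outer round: inner training on every worker, then a single
    outer momentum-SGD step on the averaged pseudo-gradient."""
    local_ws = [local_sgd(w, s, inner_steps, inner_lr) for s in shards]
    pseudo_grad = w - sum(local_ws) / len(local_ws)  # the only synced value
    v = momentum * v + pseudo_grad
    return w - outer_lr * v, v

random.seed(0)
true_w = 3.0
# Each worker holds its own shard of data drawn from y = true_w * x.
shards = [[(x, true_w * x) for x in (random.uniform(-1.0, 1.0) for _ in range(50))]
          for _ in range(4)]
w, v = 0.0, 0.0
for _ in range(20):  # 20 syncs total, instead of one per gradient step
    w, v = diloco_round(w, shards, inner_steps=25, inner_lr=0.1,
                        outer_lr=0.7, momentum=0.5, v=v)
print(f"recovered w = {w:.2f} (target {true_w})")
```

The communication saving is the point: here the 4 workers take 2,000 local gradient steps in total, but only 20 synchronizations cross the (simulated) network.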