What utilization rate will Prime Intellect achieve in its next OpenDiLoCo demo by the end of 2024?
90-92% • 25%
92-94% • 25%
94-96% • 25%
96% or higher • 25%
Resolution source: research publications and official announcements from Prime Intellect
Prime Intellect Introduces OpenDiLoCo for 1.1 Billion Parameter AI Training with 90-95% Utilization
Jul 11, 2024, 05:56 PM
Prime Intellect has introduced OpenDiLoCo, an open-source implementation and scaling of DeepMind's Distributed Low-Communication (DiLoCo) method. The framework enables globally distributed AI model training and has been tested across three countries with 90-95% compute utilization. OpenDiLoCo lets nodes synchronize only once every 500 steps, sharply reducing communication requirements and removing the need for nodes to be in close physical proximity. The team trained a 1.1 billion parameter model, three times the size of the original DeepMind work, using a hybrid implementation combining torch FSDP and Hivemind, over links with less than 100 Mb/s of bandwidth. This marks a significant step towards making decentralized AI training more accessible and efficient.
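The core idea behind the 500-step sync can be illustrated with a toy simulation. The sketch below is a hedged illustration, not Prime Intellect's actual OpenDiLoCo code: each worker runs many local gradient steps on its own simple quadratic loss (standing in for a language-model loss), and workers communicate only once per outer round, averaging their parameter deltas ("pseudo-gradients") and applying a Nesterov-momentum outer update, as in the DiLoCo paper's setup. All names, losses, and hyperparameters here are assumptions for illustration.

```python
import numpy as np

def diloco_train(workers=4, outer_rounds=3, inner_steps=500, inner_lr=0.1,
                 outer_lr=0.7, momentum=0.9, dim=8, seed=0):
    """Toy DiLoCo-style simulation (illustrative sketch only).

    Each worker k minimizes its own quadratic loss ||x - target_k||^2 with
    local gradient descent; communication happens only once per outer round,
    i.e. every `inner_steps` steps, mirroring the 500-step sync interval
    described in the story.
    """
    rng = np.random.default_rng(seed)
    targets = rng.normal(size=(workers, dim))  # each worker's local optimum
    x = np.zeros(dim)                          # shared global parameters
    velocity = np.zeros(dim)                   # outer momentum buffer

    for _ in range(outer_rounds):
        deltas = []
        for k in range(workers):
            local = x.copy()
            for _ in range(inner_steps):       # inner loop: no communication
                grad = 2.0 * (local - targets[k])
                local -= inner_lr * grad
            deltas.append(x - local)           # worker k's "pseudo-gradient"
        outer_grad = np.mean(deltas, axis=0)   # the only all-reduce per round
        # Nesterov-momentum outer update on the averaged pseudo-gradient
        velocity = momentum * velocity + outer_grad
        x -= outer_lr * (outer_grad + momentum * velocity)

    # The global optimum of the summed quadratics is the mean of the targets.
    return x, targets.mean(axis=0)
```

Because each outer round costs one all-reduce instead of one per step, the communication volume drops by roughly the inner-step count (here 500x), which is what makes sub-100 Mb/s links workable.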