Will OpenDiLoCo be adopted by at least 10 major AI research labs by end of 2024?
Yes • 50%
No • 50%
Resolution source: Public announcements and official websites of AI research labs
Prime Intellect Introduces OpenDiLoCo for 1.1 Billion Parameter AI Training with 90-95% Utilization
Jul 11, 2024, 05:56 PM
Prime Intellect has introduced OpenDiLoCo, an open-source implementation and scaling of DeepMind's Distributed Low-Communication (DiLoCo) training method. The framework enables globally distributed AI model training and was demonstrated across three countries with 90-95% compute utilization. Because nodes synchronize only every 500 steps, workers no longer need to sit in close physical proximity. The team trained a 1.1 billion parameter model, three times the size of the original DeepMind work, using a hybrid implementation built on PyTorch FSDP and Hivemind, over links with less than 100 Mb/s of bandwidth. This development marks a significant step toward making decentralized AI training more accessible and efficient.
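The sync-every-500-steps scheme can be sketched in miniature. The following is a dependency-free toy, not Prime Intellect's implementation: the "model" is a single parameter on a quadratic objective, plain SGD stands in for the inner AdamW optimizer, and the outer momentum used in the DiLoCo paper is omitted. The constants (`WORKERS`, learning rates) are illustrative; only the 500-step sync interval comes from the story. The point it shows is that workers train independently for many steps and communicate just once per round, exchanging only a "pseudo-gradient" (shared parameters minus local parameters).

```python
H = 500          # inner steps between syncs (500, per the OpenDiLoCo story)
WORKERS = 4      # hypothetical worker count for the sketch
OUTER_LR = 0.7   # illustrative outer learning rate
INNER_LR = 0.05  # illustrative inner learning rate

def grad(x):
    # gradient of the toy objective f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

def inner_steps(x, steps, lr):
    # a worker trains locally with plain SGD (stands in for AdamW)
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def diloco_round(global_x):
    # every worker starts each round from the shared parameters
    local_xs = [inner_steps(global_x, H, INNER_LR) for _ in range(WORKERS)]
    # the only communication per round: average the pseudo-gradients
    pseudo_grad = sum(global_x - x for x in local_xs) / WORKERS
    # outer update pulls the shared parameters toward the local results
    return global_x - OUTER_LR * pseudo_grad

x = 10.0
for _ in range(3):   # three outer rounds = only three communication events
    x = diloco_round(x)
print(round(x, 4))   # approaches the optimum at 3.0
```

With 500 inner steps per round, 1,500 total optimization steps cost only three synchronizations, which is why low-bandwidth links between distant sites become viable.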