Prime Intellect Introduces OpenDiLoCo for 1.1 Billion Parameter AI Training with 90-95% Utilization
Jul 11, 2024, 05:56 PM
Prime Intellect has introduced OpenDiLoCo, an open-source implementation and scaling of DeepMind's Distributed Low-Communication (DiLoCo) method. The framework enables globally distributed AI model training and has been tested across three countries with 90-95% compute utilization. Because nodes sync only every 500 steps, the need for co-located hardware is greatly reduced. The team successfully trained a 1.1 billion parameter model, three times the size of the original DeepMind work, using a hybrid codebase built on torch FSDP and hivemind, over links of less than 100 Mb/s of bandwidth. This development marks a significant step toward making decentralized AI training more accessible and efficient.
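The sync-every-500-steps design is the core of DiLoCo's inner/outer split: workers take many cheap local optimizer steps, then infrequently average a "pseudo-gradient" and apply an outer update. A minimal sketch of that loop, in pure Python on a toy 1-D quadratic; the function names, step counts, and learning rates are illustrative assumptions (not OpenDiLoCo's actual API), and plain heavy-ball momentum stands in here for DiLoCo's Nesterov outer optimizer:

```python
# Toy sketch of the DiLoCo inner/outer training loop described above.
# Each simulated worker minimizes 0.5 * (x - target)^2 toward its own
# local target; syncing happens only once per H inner steps.

def grad(x, target):
    """Gradient of the per-worker loss 0.5 * (x - target)^2."""
    return x - target

def diloco_train(targets, rounds=10, H=500, inner_lr=0.01,
                 outer_lr=0.7, momentum=0.9):
    """Run `rounds` communication rounds; each worker takes H local
    (inner) SGD steps between syncs, then the averaged pseudo-gradient
    drives an outer momentum update on the shared weights."""
    global_x = 0.0   # shared model parameter
    velocity = 0.0   # outer-optimizer momentum buffer
    for _ in range(rounds):
        pseudo_grads = []
        for target in targets:          # one loop body per worker
            x = global_x                # start from the shared weights
            for _ in range(H):          # H cheap local steps, no comms
                x -= inner_lr * grad(x, target)
            # pseudo-gradient: displacement from the shared weights
            pseudo_grads.append(global_x - x)
        # the only communication: average the pseudo-gradients
        avg = sum(pseudo_grads) / len(pseudo_grads)
        velocity = momentum * velocity + avg   # outer momentum update
        global_x -= outer_lr * velocity        # outer step
    return global_x

# Two simulated workers with targets 1.0 and 3.0: the shared model
# converges toward their mean while syncing only once per H steps.
```

The bandwidth saving comes from the single averaged value exchanged per round: at H=500, communication volume drops by roughly 500x versus per-step gradient all-reduce, which is what makes sub-100 Mb/s links viable.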