Which continent will have the most nodes in OpenDiLoCo training by end of 2024?
North America • 25%
Europe • 25%
Asia • 25%
Other • 25%
Data from Prime Intellect's reports and research publications
Prime Intellect Introduces OpenDiLoCo for 1.1 Billion Parameter AI Training with 90-95% Utilization
Jul 11, 2024, 05:56 PM
Prime Intellect has introduced OpenDiLoCo, an open-source implementation and scaling of DeepMind's Distributed Low-Communication (DiLoCo) method. The framework enables globally distributed AI model training and has been tested across three countries with 90-95% compute utilization. OpenDiLoCo lets nodes synchronize only every 500 steps, greatly reducing the need for co-located hardware. Using a hybrid of torch FSDP and hivemind, the team trained a 1.1 billion parameter model, three times the size of the original DeepMind work, over links with less than 100 Mb/s of bandwidth. This development marks a significant step toward making decentralized AI training more accessible and efficient.
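The core idea behind DiLoCo-style training can be illustrated with a toy sketch: each worker runs many local optimization steps with no communication, then all workers synchronize once per round by averaging. This is a deliberately simplified illustration (scalar parameters, plain SGD averaging); the real OpenDiLoCo uses torch FSDP, hivemind, and an outer optimizer, none of which are modeled here.

```python
# Toy sketch of low-communication distributed training (DiLoCo-style).
# Assumption: parameters are scalars and the outer sync is a plain
# average; the actual OpenDiLoCo implementation differs substantially.

def local_step(param, grad, lr=0.1):
    """One inner SGD step on a single worker."""
    return param - lr * grad

def diloco_round(worker_params, grad_fn, inner_steps=500, lr=0.1):
    """Each worker trains independently for `inner_steps`, then all
    workers synchronize once -- communication happens only once per
    round instead of once per step."""
    # Inner phase: no communication between workers.
    for w in range(len(worker_params)):
        for _ in range(inner_steps):
            worker_params[w] = local_step(
                worker_params[w], grad_fn(worker_params[w]), lr)
    # Outer phase: a single synchronization (here, a plain average).
    avg = sum(worker_params) / len(worker_params)
    return [avg] * len(worker_params)

# Usage: two workers minimizing f(x) = x^2 (gradient 2x), syncing
# after 10 local steps instead of after every step.
params = diloco_round([4.0, -2.0], lambda x: 2 * x, inner_steps=10)
```

Because synchronization happens only every `inner_steps` steps, the bandwidth required between nodes drops by roughly that factor, which is what makes training across countries on sub-100 Mb/s links feasible.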