Will OpenDiLoCo be used to train a model with over 2 billion parameters by end of 2024?
Yes • 50%
No • 50%
Research publications and official announcements
Prime Intellect Introduces OpenDiLoCo for 1.1 Billion Parameter AI Training with 90-95% Utilization
Jul 11, 2024, 05:56 PM
Prime Intellect has introduced OpenDiLoCo, an open-source implementation and scaling of DeepMind's Distributed Low-Communication (DiLoCo) method. The framework enables globally distributed AI model training and has been demonstrated across three countries at 90-95% compute utilization. Because OpenDiLoCo lets nodes synchronize only once every 500 steps, it sharply reduces communication requirements and removes the need for co-located hardware. The team trained a 1.1 billion parameter model, three times the size of the original DeepMind work, using a hybrid implementation built on PyTorch FSDP and hivemind, over connections with less than 100 Mb/s of bandwidth. This development marks a significant step toward making decentralized AI training more accessible and efficient.
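The low bandwidth requirement follows from DiLoCo's two-level optimization: each worker runs many local (inner) AdamW steps and only periodically exchanges a parameter delta, which an outer SGD-with-Nesterov-momentum optimizer applies globally. Below is a minimal, illustrative PyTorch sketch of that loop under the 500-step sync interval the article cites. It assumes torch.distributed is already initialized, uses placeholder hyperparameters, and omits the FSDP sharding and hivemind fault-tolerant averaging that OpenDiLoCo actually uses.

```python
import torch
import torch.distributed as dist

# Illustrative DiLoCo-style training loop (not OpenDiLoCo's actual code).
# Assumes torch.distributed is initialized with one process per worker.
SYNC_EVERY = 500  # inner steps between global syncs, per the article

def diloco_train(model, data_loader, loss_fn, total_steps):
    # Inner optimizer: ordinary local AdamW steps on each worker.
    inner_opt = torch.optim.AdamW(model.parameters(), lr=4e-4)

    # Outer optimizer: SGD with Nesterov momentum applied to the
    # "pseudo-gradient", as in DeepMind's DiLoCo formulation.
    outer_params = [p.detach().clone() for p in model.parameters()]
    outer_opt = torch.optim.SGD(outer_params, lr=0.7,
                                momentum=0.9, nesterov=True)

    step = 0
    for batch, target in data_loader:
        loss = loss_fn(model(batch), target)
        inner_opt.zero_grad()
        loss.backward()
        inner_opt.step()
        step += 1

        if step % SYNC_EVERY == 0:
            world_size = dist.get_world_size()
            for outer_p, local_p in zip(outer_params, model.parameters()):
                # Pseudo-gradient: where the global params were, minus
                # where this replica ended up after SYNC_EVERY steps.
                outer_p.grad = outer_p.data - local_p.data
                # Averaging these deltas is the ONLY communication,
                # which is what keeps bandwidth needs low.
                dist.all_reduce(outer_p.grad, op=dist.ReduceOp.SUM)
                outer_p.grad /= world_size
            outer_opt.step()
            outer_opt.zero_grad()
            # Reset the local replica to the new global parameters.
            for outer_p, local_p in zip(outer_params, model.parameters()):
                local_p.data.copy_(outer_p.data)

        if step >= total_steps:
            break
```

Since only a parameter delta crosses the network once every 500 steps, per-worker traffic drops by roughly that factor compared with step-wise gradient all-reduce, which is what makes training over sub-100 Mb/s links between countries workable.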