Nous Research's DisTrO Cuts AI Training Costs by 1,000 to 10,000 Times
Aug 26, 2024, 08:39 PM
Nous Research has released a preliminary report on DisTrO (Distributed Training Over-the-Internet), a new approach to AI model training that leverages decentralized computing power. The method reduces communication requirements by 1,000 to 10,000 times, allowing AI models to be trained effectively without relying on centralized entities. The technology is architecture-agnostic and network-agnostic, enabling the use of low-bandwidth connections between heterogeneous hardware. This breakthrough could revolutionize AI by making it more accessible and open-source, akin to a SETI@home-style model for AI training.
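To put the reported 1,000 to 10,000 times reduction in context, the back-of-the-envelope sketch below estimates per-step communication for conventional data-parallel training, where each worker exchanges a full gradient every step, and compares it with the traffic remaining after the reported reduction factors. This is an illustration only, not Nous Research's algorithm; the model size, gradient precision, and link speed are assumptions chosen for the example.

```python
# Illustrative estimate only (not DisTrO's actual method).
# Assumed: 1.2B-parameter model, fp16 gradients, 100 Mbit/s internet link.

PARAMS = 1_200_000_000        # model parameters (assumed)
BYTES_PER_GRAD = 2            # bytes per fp16 gradient element (assumed)
LINK_MBPS = 100               # internet link speed in megabits/s (assumed)

full_gradient_bytes = PARAMS * BYTES_PER_GRAD      # ~2.4 GB per step
link_bytes_per_s = LINK_MBPS * 1e6 / 8

def seconds_on_link(comm_bytes: float) -> float:
    """Time spent just moving the per-step payload over the link."""
    return comm_bytes / link_bytes_per_s

print(f"Full-gradient exchange: {full_gradient_bytes / 1e9:.1f} GB "
      f"-> {seconds_on_link(full_gradient_bytes):.0f} s/step on the link")

# Reduction factors reported in the DisTrO preliminary report.
for reduction in (1_000, 10_000):
    reduced = full_gradient_bytes / reduction
    print(f"{reduction:>6}x less traffic: {reduced / 1e6:.1f} MB "
          f"-> {seconds_on_link(reduced):.2f} s/step on the link")
```

Under these assumed numbers, a full-gradient exchange would occupy a home link for minutes per step, while a 1,000 to 10,000 times smaller payload takes well under a second, which is why the claimed reduction matters for training over ordinary internet connections.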
Markets
Outcomes: Yes • 50% | No • 50%
Resolution source: Financial reports or official statements from major AI companies or research organizations

Outcomes: No • 50% | Yes • 50%
Resolution source: Announcements from major AI conferences, award ceremonies, or AI research publications

Outcomes: No • 50% | Yes • 50%
Resolution source: Official announcements from major AI research organizations (e.g., OpenAI, DeepMind, Google AI)

Outcomes: Natural Language Processing • 25% | Other • 25% | Robotics • 25% | Computer Vision • 25%
Resolution source: Official announcements or reports from AI organizations

Outcomes: OpenAI • 25% | Other • 25% | Google AI • 25% | DeepMind • 25%
Resolution source: Official announcements from AI companies

Outcomes: Other • 25% | NeurIPS • 25% | ICML • 25% | AAAI • 25%
Resolution source: Schedules and announcements from major AI conferences