Elon Musk Activates Memphis Supercluster with 100,000 GPUs to Train Grok 3.0
Jul 22, 2024, 07:13 PM
Elon Musk has announced the activation of the Memphis Supercluster, which he describes as the world's most powerful AI training cluster. The facility, a collaboration between xAI, X, and Nvidia, features 100,000 liquid-cooled H100 GPUs on a single RDMA fabric. The cluster began training at 4:10 a.m. local time in Memphis, Tennessee, and was installed in just 19 days. The Memphis Supercluster will be used to train Grok 3.0, which Musk expects to be the most powerful AI model in the world upon its release in December 2024. Grok 2.0 is anticipated to launch next month.