Elon Musk Activates Memphis Supercluster with 100,000 GPUs to Train Grok 3.0
Jul 22, 2024, 07:13 PM
Elon Musk has announced the activation of the Memphis Supercluster, which he describes as the world's most powerful AI training cluster. The facility, a collaboration between xAI, X, and Nvidia, features 100,000 liquid-cooled H100 GPUs on a single RDMA fabric and was installed in just 19 days. The cluster began training at 4:10 a.m. local time in Memphis, Tennessee. It will be used to train Grok 3.0, which is expected to be the most powerful AI model in the world upon its release in December 2024; Grok 2.0 is anticipated to launch next month.
Markets
Market 1
  No • 50%
  Yes • 50%
  Resolution source: Official announcements from xAI, X, or Elon Musk's social media accounts

Market 2
  Yes • 50%
  No • 50%
  Resolution source: AI model performance benchmarks from reputable sources such as OpenAI, Google AI, or MIT Technology Review

Market 3
  No • 50%
  Yes • 50%
  Resolution source: Official rankings and reports from AI research organizations and technology news outlets

Market 4
  Grok 3.0 • 25%
  Other • 25%
  Google DeepMind's latest model • 25%
  OpenAI GPT-5 • 25%
  Resolution source: AI model performance benchmarks from reputable sources such as OpenAI, Google AI, or MIT Technology Review

Market 5
  X • 25%
  Other • 25%
  xAI • 25%
  Nvidia • 25%
  Resolution source: Official financial reports from xAI, X, Nvidia, and other relevant companies

Market 6
  Other • 25%
  Google • 25%
  Microsoft • 25%
  Amazon • 25%
  Resolution source: Official announcements from AI and tech companies