xAI's Grok 3 Pretraining Completed on 100k H100 Cluster with 10 Times More Compute
Jan 4, 2025, 02:16 AM
Elon Musk's xAI has completed the pretraining phase for its upcoming AI model, Grok 3, which used 10 times more computational power than its predecessor, Grok 2. Pretraining was carried out on a cluster of 100,000 H100 GPUs. Musk announced that Grok 3 will be released soon and said he expects it to be the most powerful AI in the world. The development of Grok 3 is part of xAI's ongoing effort to advance AI technology, with the company reportedly working toward linking 1,000,000 GPUs together.