xAI's Grok 3 Pretraining Completed on 100k H100 Cluster with 10 Times More Compute
Jan 4, 2025, 02:16 AM
Elon Musk's xAI has completed pretraining for its upcoming AI model, Grok 3, using 10 times more compute than its predecessor, Grok 2. Pretraining ran on a cluster of 100,000 H100 GPUs. Musk announced that Grok 3 will be released soon and is expected to be the most powerful AI in the world. The model is part of xAI's ongoing push to scale its AI infrastructure, with the company reportedly working toward linking 1,000,000 GPUs together.