xAI's Grok 3 Pretraining Completed on 100k H100 Cluster with 10 Times More Compute
Jan 4, 2025, 02:16 AM
Elon Musk's xAI has completed the pretraining phase for its upcoming AI model, Grok 3, using 10 times more computational power than its predecessor, Grok 2, on a cluster of 100,000 H100 GPUs. Musk announced that Grok 3 will be released soon and is expected to be the most powerful AI in the world. The development of Grok 3 is part of xAI's ongoing effort to advance AI technology, with the company reportedly working toward linking 1,000,000 GPUs together.