xAI's Grok 3 Pretraining Completed on 100k H100 Cluster with 10 Times More Compute
Jan 4, 2025, 02:16 AM
Elon Musk's xAI has completed the pretraining phase for its upcoming AI model, Grok 3, which used 10 times more computational power than its predecessor, Grok 2. Pretraining was carried out on a 100k H100 GPU cluster. Musk announced that Grok 3 will be released soon, with the expectation that it will be the most powerful AI in the world. The development of Grok 3 is part of xAI's ongoing effort to advance AI technology, with the company reportedly working toward linking 1,000,000 GPUs together.