Who will be Apple's primary AI training hardware provider in 2025?
Google TPUs • 25%
Nvidia GPUs • 25%
Apple's own hardware • 25%
Other • 25%
Resolution source: Official announcements or technical papers from Apple
Apple's Foundation Language Models Trained on Google's TPUs, Marking Shift from Nvidia GPUs
Jul 29, 2024, 09:19 PM
Apple has disclosed that the AI models powering its Apple Intelligence features were trained on Google's Tensor Processing Units (TPUs). A technical paper released by Apple details the use of Google's TPUs for both the on-device and server versions of the models: the on-device model was trained on 2048 TPU v5p chips, while the server model was trained on 8192 TPU v4 chips. This marks a significant shift away from the industry-standard Nvidia GPUs, which were not used in any part of the training process, and highlights a growing trend among major tech companies to explore alternatives to Nvidia for AI training. Apple also states that its Foundation Language Models were developed with a responsible approach to AI training.