Will Cerebras secure a major cloud provider partnership by mid-2025?
Yes • 50%
No • 50%
Resolution source: official press releases and announcements from Cerebras Systems and the cloud provider.
Cerebras Challenges Nvidia with Fastest AI Inference Service
Aug 27, 2024, 04:05 PM
Cerebras Systems has launched what it claims is the world's fastest AI inference service, mounting a direct challenge to Nvidia's dominance in AI computing. The service, powered by Cerebras' custom wafer-scale AI accelerator chips, gives developers access to high-speed inference at lower cost. Cerebras says it has set a new record for AI inference speed, serving Llama 3.1 8B at 1,850 output tokens per second and Llama 3.1 70B at 446 output tokens per second. The launch is part of a broader push by several chipmakers to break Nvidia's stronghold on the AI market.
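The reported throughput figures can be put in perspective with a quick back-of-the-envelope calculation. The tokens-per-second numbers below are the claims from the article; the 500-token response length is a hypothetical workload chosen purely for illustration:

```python
# Claimed Cerebras inference throughput, in output tokens per second
# (figures reported in the article above).
throughputs = {
    "Llama 3.1 8B": 1850,
    "Llama 3.1 70B": 446,
}

# Hypothetical single-response length, chosen for illustration only.
response_tokens = 500

for model, tps in throughputs.items():
    seconds = response_tokens / tps
    print(f"{model}: ~{seconds:.2f} s to generate {response_tokens} tokens")
```

At the claimed rates, a 500-token response would take roughly a quarter of a second on the 8B model and just over a second on the 70B model, which is the kind of latency gap the service is marketing against GPU-based inference.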