Will Nvidia announce a new AI inference service to compete with Cerebras by Q1 2025?
Yes • 50%
No • 50%
Official announcements from Nvidia
Cerebras Launches AI Inference Service, 20x Faster with 1,850 Tokens/sec for 8B Model
Aug 27, 2024, 05:22 PM
Cerebras Systems has launched what it claims is the world's fastest AI inference service, directly challenging Nvidia's dominance in AI computing. The new service, Cerebras Inference, runs on the company's custom wafer-scale AI accelerator chips and serves Llama 3.1 models at 1,850 tokens per second for the 8B model and 450 tokens per second for the 70B model. Cerebras says this is a 20x speed increase over traditional GPU-based systems, at a price of 60 cents per million tokens, a fifth of what hyperscalers charge. The company positions the launch as a cost-effective alternative to Nvidia's GPUs, claiming 100x better price-performance.
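The 100x price-performance claim is consistent with the two ratios quoted above. Here is a minimal sketch of that arithmetic, assuming price-performance is read as throughput per dollar and that the hyperscaler baseline price is inferred from the "one fifth of the cost" comparison; neither interpretation is stated explicitly in the announcement.

```python
# Rough sanity check of the quoted claims (assumed interpretation,
# not an official Cerebras or Nvidia calculation).

CEREBRAS_PRICE_PER_M_TOKENS = 0.60  # dollars, as quoted for Cerebras Inference
# Implied by the "one fifth of the cost" claim, not stated directly:
HYPERSCALER_PRICE_PER_M_TOKENS = CEREBRAS_PRICE_PER_M_TOKENS * 5

SPEEDUP = 20  # claimed throughput advantage over GPU-based systems
PRICE_RATIO = HYPERSCALER_PRICE_PER_M_TOKENS / CEREBRAS_PRICE_PER_M_TOKENS  # = 5

# If price-performance means throughput per dollar, the two quoted
# ratios multiply to the headline figure.
price_performance_gain = SPEEDUP * PRICE_RATIO

print(f"Implied hyperscaler price: ${HYPERSCALER_PRICE_PER_M_TOKENS:.2f} per million tokens")
print(f"Implied price-performance gain: {price_performance_gain:.0f}x")  # -> 100x
```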
Cerebras significantly outperforms Nvidia • 25%
Nvidia significantly outperforms Cerebras • 25%
Nvidia slightly outperforms Cerebras • 25%
Cerebras slightly outperforms Nvidia • 25%
Nvidia offers significantly better price-performance • 25%
Cerebras offers significantly better price-performance • 25%
Cerebras offers slightly better price-performance • 25%
Nvidia offers slightly better price-performance • 25%