Will Cerebras announce a partnership with a major cloud provider by the end of 2024?
Yes • 50%
No • 50%
Resolution source: Press releases and official announcements from Cerebras or the cloud provider
Cerebras Launches World's Fastest AI Inference Tool, 20x Faster Than Nvidia
Aug 27, 2024, 04:04 PM
Cerebras Systems has launched a new AI inference service that aims to challenge Nvidia's dominance in the AI computing market. The startup claims the service, called Cerebras Inference, is the world's fastest AI inference offering, processing 1,850 tokens per second for Llama 3.1 8B models and 446 tokens per second for Llama 3.1 70B models (around 450 tokens per second in some configurations). The service is priced at 60 cents per million tokens, a fifth of the cost charged by hyperscalers, and runs models at full 16-bit precision to preserve accuracy. Cerebras says the service is roughly 20 times faster than Nvidia GPU-based offerings and twice as fast as Groq's, making it a competitive alternative for AI developers. The performance comes from Cerebras' custom wafer-scale chips.
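For scale, here is a rough back-of-the-envelope sketch of what the quoted figures imply for a hypothetical workload; the workload size and the implied hyperscaler price are assumptions for illustration, not figures from the article.

```python
# Back-of-the-envelope check of the quoted Cerebras Inference figures.
# Throughput and price come from the article; the workload size and the
# implied hyperscaler price (derived from the "a fifth of the cost"
# claim) are assumptions for illustration.

TOKENS_PER_SEC_8B = 1_850     # quoted throughput, Llama 3.1 8B
TOKENS_PER_SEC_70B = 446      # quoted throughput, Llama 3.1 70B
PRICE_PER_MILLION_USD = 0.60  # quoted price per million tokens

workload_tokens = 10_000_000  # hypothetical workload (assumption)

cerebras_cost = workload_tokens / 1_000_000 * PRICE_PER_MILLION_USD
implied_hyperscaler_cost = cerebras_cost * 5  # from the "one fifth" claim
hours_at_70b = workload_tokens / TOKENS_PER_SEC_70B / 3600
hours_at_8b = workload_tokens / TOKENS_PER_SEC_8B / 3600

print(f"Cerebras cost:            ${cerebras_cost:.2f}")
print(f"Implied hyperscaler cost: ${implied_hyperscaler_cost:.2f}")
print(f"Time at 70B throughput:   {hours_at_70b:.1f} h")
print(f"Time at 8B throughput:    {hours_at_8b:.1f} h")
```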