Will SambaNova Cloud be adopted by top 100 AI companies by March 31, 2025?
Yes • 50%
No • 50%
Resolution source: Publicly available reports and press releases from top 100 AI companies
SambaNova Launches Fastest AI Inference Platform for Llama 3.1 at 570 Tokens/Second
Sep 10, 2024, 04:34 PM
SambaNova has announced the launch of its new cloud inference platform, SambaNova Cloud, which offers unprecedented speeds for AI model inference. Notably, the Llama 3.1 405B model runs at 132 tokens per second in full 16-bit precision, while the Llama 3.1 70B model reaches up to 570 tokens per second. SambaNova claims this is up to 10 times faster than inference on traditional GPUs. The platform serves models in real time and is available to developers starting today, with free access via API and no waitlist. The performance figures have been independently verified, and the service is expected to enable advanced AI applications.
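The announcement notes free developer access to the hosted models via an API but gives no integration details. As a rough illustration only, the sketch below shows how such a hosted inference endpoint might be called from Python using the OpenAI client library; the base URL, API-key environment variable, and model identifier are assumptions for illustration, not details confirmed by the announcement.

```python
# Minimal sketch of calling a hosted Llama 3.1 inference API.
# Assumptions (not from the announcement): the endpoint is OpenAI-compatible,
# the base URL is https://api.sambanova.ai/v1, and the model id is
# "Meta-Llama-3.1-70B-Instruct". Substitute values from the provider's docs.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.sambanova.ai/v1",   # assumed endpoint
    api_key=os.environ["SAMBANOVA_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="Meta-Llama-3.1-70B-Instruct",      # assumed model identifier
    messages=[
        {"role": "user", "content": "Summarize the Llama 3.1 model family."}
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

Whether a dedicated SDK, an OpenAI-compatible endpoint, or another interface is offered would need to be checked against SambaNova's own documentation.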