SambaNova Launches Fastest AI Platform with Record 132 Tokens/Sec for Llama 3.1 405B
Sep 10, 2024, 02:48 PM
SambaNova has launched its new cloud inference platform, SambaNova Cloud, which gives developers access to the Llama 3.1 models (8B, 70B, and 405B) running on the company's custom AI chips. The platform sets a new record for inference speed, achieving 132 tokens per second for Llama 3.1 405B at full precision and 570 tokens per second for Llama 3.1 70B, 10 times faster than traditional GPUs. The API is available for free with no waitlist, enabling developers to build advanced AI applications. Separately, Llama 3.1 405B reaches 100 tokens per second on the TogetherCompute API, with a 128k long-context version coming soon.
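For developers who want to try the free API, the sketch below shows one way to send a chat request to SambaNova Cloud using the OpenAI-compatible Python client. The base URL, model identifier, and environment variable name are assumptions for illustration; consult SambaNova's documentation for the exact values.

# Minimal sketch of calling SambaNova Cloud via an OpenAI-compatible
# chat-completions interface. Endpoint, env var, and model name are assumed.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["SAMBANOVA_API_KEY"],   # assumed environment variable
    base_url="https://api.sambanova.ai/v1",    # assumed API endpoint
)

response = client.chat.completions.create(
    model="Meta-Llama-3.1-405B-Instruct",      # assumed model identifier
    messages=[{"role": "user", "content": "Summarize Llama 3.1 in one sentence."}],
)
print(response.choices[0].message.content)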
Markets
Market 1: Yes • 50%, No • 50% (resolution source: SambaNova official announcements, benchmark reports)
Market 2: Yes • 50%, No • 50% (resolution source: official announcements from SambaNova)
Market 3: Yes • 50%, No • 50% (resolution source: official announcements from SambaNova and major cloud providers, e.g., AWS, Google Cloud, Microsoft Azure)
Market 4: Less than 10,000 • 25%, 10,000 to 50,000 • 25%, 50,001 to 100,000 • 25%, More than 100,000 • 25% (resolution source: SambaNova official reports, developer community statistics)
Market 5: Less than 5% • 25%, 5% to 10% • 25%, 10.1% to 20% • 25%, More than 20% • 25% (resolution source: market analysis reports from firms such as Gartner, IDC, or Forrester)
Market 6: Llama 3.1 8B • 25%, Llama 3.1 70B • 25%, Llama 3.1 405B • 25%, Other • 25% (resolution source: SambaNova usage statistics)