Will SambaNova Cloud be integrated into a major cloud provider's platform by mid-2024?
Yes • 50%
No • 50%
Official announcements from SambaNova and major cloud providers (e.g., AWS, Google Cloud, Microsoft Azure)
SambaNova Launches Fastest AI Platform with Record 132 Tokens/Sec for Llama 3.1 405B
Sep 10, 2024, 02:48 PM
SambaNova has launched its new cloud inference platform, SambaNova Cloud, which gives developers access to the Llama 3.1 models (8B, 70B, and 405B) running on its custom AI chips. The platform sets a new record for inference speed, achieving 132 tokens per second for Llama 3.1 405B at full precision and 570 tokens per second for Llama 3.1 70B, roughly 10 times faster than traditional GPUs. The API is available for free with no waitlist, enabling developers to build advanced AI applications. Separately, Llama 3.1 405B also reaches 100 tokens per second on the TogetherCompute API, with a 128k long-context version coming soon.
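As a rough sense of what the quoted throughput figures mean in practice, the following is a minimal sketch (plain arithmetic derived from the numbers above, not SambaNova's API) estimating how long a completion of a given length would take to stream at each model's decode rate:

```python
# Back-of-the-envelope latency from the quoted decode rates:
# 132 tok/s for Llama 3.1 405B, 570 tok/s for Llama 3.1 70B.
def generation_time_seconds(num_tokens: int, tokens_per_second: float) -> float:
    """Time to stream num_tokens at a steady decode rate.

    Ignores time-to-first-token and network latency, so this is a
    lower bound on wall-clock response time.
    """
    return num_tokens / tokens_per_second

# A 1,000-token completion on the 405B model at 132 tok/s:
print(round(generation_time_seconds(1000, 132), 1))  # ~7.6 s
# The same completion on the 70B model at 570 tok/s:
print(round(generation_time_seconds(1000, 570), 1))  # ~1.8 s
```

At these rates, even the largest model streams a long answer in single-digit seconds, which is the basis for the "10 times faster than traditional GPUs" comparison in the story.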
Less than 10% • 25%
10% to 20% • 25%
20% to 30% • 25%
More than 30% • 25%
Yes • 50%
No • 50%
1st • 25%
2nd • 25%
3rd • 25%
4th or lower • 25%
Amazon Web Services • 25%
Microsoft Azure • 25%
Google Cloud • 25%
Other • 25%
Yes • 50%
No • 50%
AWS • 25%
Google Cloud • 25%
Microsoft Azure • 25%
Other • 25%
Yes • 50%
No • 50%
Amazon Web Services (AWS) • 25%
Google Cloud Platform (GCP) • 25%
Microsoft Azure • 25%
Other • 25%
Yes • 50%
No • 50%
Less than 10,000 • 25%
10,000 to 50,000 • 25%
50,001 to 100,000 • 25%
More than 100,000 • 25%
Less than 5% • 25%
5% to 10% • 25%
10.1% to 20% • 25%
More than 20% • 25%