What will be SambaNova Cloud's market share in the AI inference platform market by end of 2024?
Less than 5% • 25%
5% to 10% • 25%
10.1% to 20% • 25%
More than 20% • 25%
Resolution source: market analysis reports from firms such as Gartner, IDC, or Forrester
SambaNova Launches Fastest AI Platform with Record 132 Tokens/Sec for Llama 3.1 405B
Sep 10, 2024, 02:48 PM
SambaNova has launched its new cloud inference platform, SambaNova Cloud, which gives developers access to Llama 3.1 models, including 8B, 70B, and 405B, running on its custom AI chips. The platform sets a new record for inference speed, achieving 132 tokens per second for Llama 3.1 405B at full precision and 570 tokens per second for Llama 3.1 70B, which is 10 times faster than inference on traditional GPUs. The API is available for free with no waitlist, enabling developers to build advanced AI applications. Separately, Llama 3.1 405B can reach 100 tokens per second on the TogetherCompute API, with a 128k long-context version coming soon.
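For developers who want to try the platform, here is a minimal sketch of calling a hosted Llama 3.1 model through an OpenAI-compatible chat-completions client. The base URL, model identifier, and API-key environment variable below are assumptions for illustration only; the story does not specify the endpoint details.

```python
# Minimal sketch: querying a hosted Llama 3.1 model via an
# OpenAI-compatible chat-completions endpoint.
# Assumptions (not confirmed by the story): the base URL
# "https://api.sambanova.ai/v1", the model id
# "Meta-Llama-3.1-405B-Instruct", and the SAMBANOVA_API_KEY
# environment variable are illustrative placeholders.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.sambanova.ai/v1",   # assumed endpoint
    api_key=os.environ["SAMBANOVA_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="Meta-Llama-3.1-405B-Instruct",  # assumed model id
    messages=[
        {"role": "user", "content": "Summarize Llama 3.1 in one sentence."}
    ],
    max_tokens=200,
)

print(response.choices[0].message.content)
```

At the reported 132 tokens per second for the 405B model, a 200-token reply like the one requested above would arrive in roughly 1.5 seconds (200 / 132 ≈ 1.5 s).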