Further integration of Fugaku-LLM in SambaNova's product line by 2024?
Yes • 50%
No • 50%
Resolution source: Press releases or financial reports from SambaNova Systems
Japan's CPU-Trained Fugaku-LLM, a 13B Language Model, Integrated into SambaNova
May 13, 2024, 06:28 AM
The newly released Fugaku-LLM, a large language model with 13 billion parameters trained on roughly 400 billion tokens, was developed on the Fugaku supercomputer using primarily CPU resources. The initiative is one of the first major LLM projects in Japan, spearheaded by the founders of Kotoba Technologies and led by notable figures such as Rio Yokota. Built by a collaborative team of top researchers and organizations including CyberAgent and Fujitsu, the model stands out for its scale and its use of distributed training methods on CPU hardware. Fugaku-LLM has also been integrated into SambaNova Systems' Samba-1 and is being showcased at ISC24 in booth A11, Hall H.
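For readers who want to experiment with the model, the sketch below shows one way a publicly released 13B checkpoint like this could be loaded and queried with the Hugging Face transformers library. The repository ID "Fugaku-LLM/Fugaku-LLM-13B" and the prompt are assumptions for illustration, not details confirmed by this story.

```python
# Minimal sketch: loading a 13B causal LM with Hugging Face transformers.
# The model ID "Fugaku-LLM/Fugaku-LLM-13B" is an assumption; check the
# official release for the actual repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Fugaku-LLM/Fugaku-LLM-13B"  # assumed Hugging Face repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 13B parameters is roughly 26 GB in bf16
    device_map="auto",           # spread layers across available devices
)

prompt = "The Fugaku supercomputer is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that inference is independent of how the model was trained: although Fugaku-LLM was trained on CPUs, the checkpoint can be served on whatever hardware transformers supports.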
Related markets:

Token handling capacity • 33%
Processing speed • 33%
Energy efficiency • 34%

Yes • 50%
No • 50%

Launch of a competing model • 50%
Collaboration with Fugaku-LLM's team • 25%
No significant response • 25%

Healthcare • 25%
Automotive • 25%
Finance • 25%
Entertainment • 25%

Yes • 50%
No • 50%

Primarily in tech and data centers • 25%
Expansion into healthcare and pharmaceuticals • 25%
Growth in automotive and manufacturing • 25%
Limited to specialized AI research fields • 25%

Yes • 50%
No • 50%

Leader in AI chip market • 33%
Top 3 in AI chip market • 33%
Remains a niche player • 34%

0-1 new partnerships • 25%
2-5 new partnerships • 25%
6-10 new partnerships • 25%
More than 10 new partnerships • 25%

Telecommunications • 25%
Healthcare • 25%
Finance • 25%
Automotive • 25%

North America • 25%
Asia • 25%
Europe • 25%
Other regions • 25%