Japan's CPU-Trained Fugaku-LLM, a 13B Language Model, Integrated into SambaNova
May 13, 2024, 06:28 AM
The newly released Fugaku-LLM, a 13-billion-parameter large language model, was trained on the Fugaku supercomputer, primarily using CPU resources, on roughly 400 billion tokens of text. The project is one of the first major LLM efforts in Japan, spearheaded by the founders of Kotoba and led by notable figures such as Rio Yokota. Developed by a collaborative team that includes top researchers and companies such as CyberAgent and Fujitsu, the model stands out both for its size and for its use of distributed training methods on CPU hardware. Fugaku-LLM has also been integrated into SambaNova Systems' Samba-1 and is being showcased at ISC24 in booth A11, Hall H.
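The story names the model but includes no code. As a minimal sketch of how a 13B checkpoint like this is typically loaded for CPU inference with Hugging Face transformers (the hub id Fugaku-LLM/Fugaku-LLM-13B is an assumption based on the model's public release, not something stated in this story):

```python
# Minimal sketch: loading a 13B causal LM for CPU inference with
# Hugging Face transformers. The hub id below is assumed; verify it
# against the actual published checkpoint before use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Fugaku-LLM/Fugaku-LLM-13B"  # assumed hub path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # halves memory vs. float32
)
model.eval()

prompt = "Explain why the Fugaku supercomputer uses CPUs rather than GPUs."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

On a CPU-only machine the 13B weights in bfloat16 occupy on the order of 26 GB of RAM and generation is slow; this is a sketch for experimentation, not a description of how SambaNova serves the model inside Samba-1.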
Markets
Market 1
  Yes • 50%
  No • 50%
  Resolution source: Published results from reputable tech or AI research journals or conferences

Market 2
  Yes • 50%
  No • 50%
  Resolution source: Press releases or financial reports from SambaNova Systems

Market 3
  Yes • 50%
  No • 50%
  Resolution source: Official announcements from tech companies or credible business news outlets

Market 4
  Telecommunications • 25%
  Healthcare • 25%
  Finance • 25%
  Automotive • 25%
  Resolution source: Industry reports, company announcements, or credible news sources

Market 5
  North America • 25%
  Asia • 25%
  Europe • 25%
  Other regions • 25%
  Resolution source: Global tech adoption reports or multinational corporation disclosures

Market 6
  Scalability • 25%
  Language versatility • 25%
  Performance efficiency • 25%
  Integration capabilities • 25%
  Resolution source: Expert reviews, tech blogs, and panel discussions at major tech conferences