Major non-Japanese tech company adoption of Fugaku-LLM within a year
Yes • 50%
No • 50%
Resolution source: official announcements from tech companies or credible business news outlets
Japan's CPU-Trained Fugaku-LLM, a 13B Language Model, Integrated into SambaNova
May 13, 2024, 06:28 AM
The newly released Fugaku-LLM, a large language model with 13 billion parameters trained on roughly 400 billion tokens, was built on Japan's Fugaku supercomputer using primarily CPU resources rather than GPUs. The project, one of the first major LLM efforts in Japan, was spearheaded by the founders of Kotoba and led by notable figures such as Rio Yokota, with a collaborative team of top researchers and entities including CyberAgent and Fujitsu. The model is notable for its scale and its use of distributed training methods on CPU hardware. Fugaku-LLM has also been integrated into SambaNova Systems' Samba-1 and is being showcased at ISC24 in booth A11, Hall H.
Healthcare • 25%
Automotive • 25%
Finance • 25%
Entertainment • 25%
Google • 25%
Amazon • 25%
Microsoft • 25%
IBM • 25%
Launch of a competing model • 50%
Collaboration with Fugaku-LLM's team • 25%
No significant response • 25%
Google • 25%
Microsoft • 25%
Amazon • 25%
Meta • 25%
Yes • 50%
No • 50%
Llama8B • 25%
Qwen 2 • 25%
Nemotron • 25%
Other • 25%
Google • 25%
Amazon • 25%
Apple • 25%
Microsoft • 25%
Telecommunications • 25%
Healthcare • 25%
Finance • 25%
Automotive • 25%
North America • 25%
Asia • 25%
Europe • 25%
Other regions • 25%