Does Fugaku-LLM outperform GPT-3 by end of 2024?
Yes • 50%
No • 50%
Resolves based on published results from reputable tech or AI research journals or conferences.
Japan's CPU-Trained Fugaku-LLM, a 13B Language Model, Integrated into SambaNova
May 13, 2024, 06:28 AM
The newly released Fugaku-LLM, a large language model with 13 billion parameters trained on roughly 400 billion tokens, was built on the Fugaku supercomputer using primarily CPU resources. The initiative is one of Japan's first major LLM projects, spearheaded by the founders of Kotoba Technologies and led by researchers including Rio Yokota. Developed by a collaborative team that also includes CyberAgent and Fujitsu, the model is notable both for its scale and for its use of distributed training methods on CPU hardware. Fugaku-LLM has additionally been integrated into SambaNova Systems' Samba-1 and is being showcased at ISC24 in booth A11, Hall H.
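For readers who want to try the released model, here is a minimal inference sketch using the Hugging Face transformers library. The repository ID Fugaku-LLM/Fugaku-LLM-13B-instruct and the example prompt are assumptions for illustration; consult the official release for the exact checkpoint name and prompt format.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository ID; verify against the official Fugaku-LLM release.
MODEL_ID = "Fugaku-LLM/Fugaku-LLM-13B-instruct"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # a 13B-parameter model is roughly 26 GB in bf16
    device_map="auto",           # shard layers across whatever devices are available
)

# Example Japanese prompt ("Tell me about the supercomputer Fugaku."),
# since the model was trained with an emphasis on Japanese text.
prompt = "スーパーコンピュータ「富岳」について教えてください。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

With device_map="auto", transformers places the weights on whatever accelerator or CPU memory is available; generation on CPU alone will be slow but works, fitting for a model that was itself trained on CPUs.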
Additional markets on this story:

Surpasses expectations • 25%
Meets expectations • 25%
Below expectations • 25%
Significantly below expectations • 25%

Telecommunications • 25%
Healthcare • 25%
Finance • 25%
Automotive • 25%

North America • 25%
Asia • 25%
Europe • 25%
Other regions • 25%