Will Fugaku-LLM outperform a leading Western AI model by 2024?
Yes • 50%
No • 50%
Results published by an independent testing organization or academic paper
Japan's Fugaku-LLM AI Model: 13B Parameters, 400 Tokens
May 11, 2024, 05:24 AM
A team of Japanese researchers has released Fugaku-LLM, a large language model with 13 billion parameters that handles up to 400 tokens. The model was trained on the Fugaku supercomputer, leveraging its CPU capabilities rather than GPUs. The project is one of the earliest large language model initiatives in Japan, spearheaded by the founders of Kotoba and led by researcher Rio Yokota, with a team of top researchers collaborating through distributed training methods.