Will Fugaku-LLM achieve a new efficiency milestone by the end of 2024?
Yes • 50%
No • 50%
Resolution source: Official announcements from the developers or peer-reviewed scientific publications
Japan's Fugaku-LLM AI Model: 13B Parameters, 400 Tokens
May 11, 2024, 05:24 AM
A team of Japanese researchers has released Fugaku-LLM, a large language model with 13 billion parameters that handles up to 400 tokens. The model was trained on the Fugaku supercomputer, leveraging its CPUs. The project, one of the earliest large language model initiatives in Japan, was spearheaded by the founders of Kotoba and led by researcher Rio Yokota; it brought together top researchers and relied on distributed training methods.
Healthcare • 25%
Finance • 25%
Automotive • 25%
Telecommunications • 25%

Asia • 25%
Europe • 25%
North America • 25%
Other regions • 25%

Performance efficiency • 25%
Scalability • 25%
Language versatility • 25%
Integration capabilities • 25%

10-20% • 25%
21-50% • 25%
51-100% • 25%
More than 100% • 25%

No significant response • 25%
Collaboration with Fugaku-LLM's team • 25%
Launch of a competing model • 50%

Token handling capacity • 33%
Energy efficiency • 34%
Processing speed • 33%