Japan's Fugaku-LLM AI Model: 13B Parameters, 400B Tokens
May 11, 2024, 05:24 AM
A significant advance in artificial intelligence has been achieved with the release of Fugaku-LLM, a large language model developed by a team of Japanese researchers. The model has 13 billion parameters and was trained on roughly 400 billion tokens using the Fugaku supercomputer, which relies on CPUs rather than GPUs. The project is one of the earliest large language model initiatives in Japan, spearheaded by the founders of Kotoba and led by researcher Rio Yokota. The collaboration brought together top researchers and used distributed training methods to scale across Fugaku's nodes.
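As a minimal sketch of how the released checkpoint could be queried, the Python below uses the Hugging Face transformers library. The repository id Fugaku-LLM/Fugaku-LLM-13B and the Japanese prompt are assumptions not stated in this story; verify the id before use.

```python
# Minimal sketch: load Fugaku-LLM and generate text with transformers.
# The repo id below is an assumption based on the public release.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Fugaku-LLM/Fugaku-LLM-13B"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 13B params: bf16 halves memory vs fp32
    device_map="auto",           # requires accelerate; places layers automatically
)

prompt = "スーパーコンピュータ「富岳」とは"  # "The Fugaku supercomputer is..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that at 13 billion parameters the bf16 weights alone occupy roughly 26 GB, so a machine with substantial GPU or system memory is needed.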
Markets
Market 1
Yes • 50%
No • 50%
Resolution source: Official announcements from the developers or peer-reviewed scientific publications

Market 2
No • 50%
Yes • 50%
Resolution source: Press releases from major tech companies or credible tech news outlets

Market 3
No • 50%
Yes • 50%
Resolution source: Results published by an independent testing organization or academic paper

Market 4
No significant response • 25%
Collaboration with Fugaku-LLM's team • 25%
Launch of a competing model • 50%
Resolution source: Market analysis reports or press statements from competing organizations

Market 5
Token handling capacity • 33%
Energy efficiency • 34%
Processing speed • 33%
Resolution source: Technical documentation or official announcements from the development team

Market 6
Finance • 25%
Healthcare • 25%
Automotive • 25%
Entertainment • 25%
Resolution source: Industry reports or press releases from adopting entities