NVIDIA Unveils Nemotron-4 340B with 340B Parameters, Tops HuggingFace RewardBench
Jun 14, 2024, 04:23 PM
NVIDIA has announced Nemotron-4 340B, a family of open models designed to generate synthetic data for training large language models (LLMs) for commercial applications across sectors including healthcare, finance, manufacturing, and retail. The 340-billion-parameter dense LLM is reported to match the performance of OpenAI's GPT-4 for chat applications and synthetic data generation, and it supports a 4,096-token context window. NVIDIA does not claim ownership of any outputs the models generate. The accompanying reward model ranks first on the HuggingFace RewardBench leaderboard, and the family is optimized for generative AI training.
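As a rough illustration of the synthetic-data workflow the story describes, the sketch below generates domain-specific question/answer pairs by calling an instruct model through an OpenAI-compatible client. The endpoint URL, model identifier, NVIDIA_API_KEY variable, and prompts are assumptions for illustration only and are not taken from the announcement.

# Minimal sketch, assuming an OpenAI-compatible hosted endpoint for
# Nemotron-4 340B Instruct; the base_url, model name, and credential
# variable below are hypothetical placeholders.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # assumed endpoint
    api_key=os.environ["NVIDIA_API_KEY"],            # assumed credential variable
)

def generate_synthetic_example(topic: str) -> str:
    """Ask the instruct model for one synthetic Q&A pair in a given domain."""
    response = client.chat.completions.create(
        model="nvidia/nemotron-4-340b-instruct",  # assumed model identifier
        messages=[
            {"role": "system",
             "content": "You write realistic question/answer pairs for LLM training data."},
            {"role": "user",
             "content": f"Write one question and its answer about {topic}."},
        ],
        temperature=0.7,
        max_tokens=512,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Sectors named in the article; in practice the outputs would be filtered
    # (e.g., with a reward model) before being used as training data.
    for domain in ["healthcare", "finance", "manufacturing", "retail"]:
        print(generate_synthetic_example(domain))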
Markets
Outcomes: Yes • 50%, No • 50%. Resolution source: Official announcements from major healthcare providers or NVIDIA press releases.
Outcomes: Yes • 50%, No • 50%. Resolution source: HuggingFace RewardBench leaderboard updates.
Outcomes: Yes • 50%, No • 50%. Resolution source: NVIDIA press releases and official announcements.
Outcomes: Retail • 25%, Healthcare • 25%, Finance • 25%, Manufacturing • 25%. Resolution source: NVIDIA's official announcements and industry reports.
Outcomes: Healthcare • 25%, Finance • 25%, Manufacturing • 25%, Retail • 25%. Resolution source: Industry reports and NVIDIA's official announcements.
Outcomes: Claude-2 • 33%, LLaMA-3 • 33%, GPT-4 • 33%. Resolution source: HuggingFace RewardBench leaderboard updates.