Will Nemotron-4 340B be adopted by a major financial institution by end of 2024?
Yes • 50%
No • 50%
Resolution source: public announcements and press releases from major financial institutions
NVIDIA Releases Open-Source Nemotron-4 340B with 340B Parameters for Synthetic Data Generation
Jun 14, 2024, 07:11 PM
NVIDIA has announced the release of Nemotron-4 340B, a family of open-source models designed for generating synthetic data to train large language models (LLMs) for commercial applications. The family includes Base, Instruct, and Reward variants, all optimized for the NVIDIA NeMo and TensorRT-LLM platforms. With 340 billion parameters trained on 9 trillion tokens, Nemotron-4 340B is one of the largest openly available models to date. It is intended to help developers in industries such as healthcare, finance, manufacturing, and retail create synthetic data for training LLMs. The models are released under a permissive license that allows developers to own any derivative models and model outputs. The Reward variant ranks first on the Hugging Face RewardBench leaderboard, and the Instruct model surpasses Llama-3-70B on several benchmarks, including Arena-Hard-Auto. The Instruct model was tested under the codename "June-chatbot," and the Reward model was trained with HelpSteer2 preference data.
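To make the synthetic-data workflow described above concrete, here is a minimal sketch of generating training examples with the Instruct model. It assumes the model is served behind an OpenAI-compatible endpoint (for example via a NeMo or TensorRT-LLM deployment); the base URL, API key, and model identifier below are placeholders, not details confirmed by the story.

```python
# Minimal sketch: requesting synthetic Q/A pairs from an OpenAI-compatible
# endpoint assumed to be serving Nemotron-4 340B Instruct.
# NOTE: base_url, api_key, and MODEL are placeholders/assumptions; adjust
# them to match your actual deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",    # placeholder endpoint
    api_key="not-needed-for-local-server",  # placeholder key
)

MODEL = "nvidia/nemotron-4-340b-instruct"   # assumed model identifier

PROMPT = (
    "Write one realistic customer-support question about a retail banking "
    "mobile app, followed by a concise, accurate answer. "
    "Format as:\nQ: ...\nA: ..."
)

def generate_synthetic_pairs(n: int = 5) -> list[str]:
    """Ask the Instruct model for n synthetic Q/A pairs."""
    pairs = []
    for _ in range(n):
        resp = client.chat.completions.create(
            model=MODEL,
            messages=[{"role": "user", "content": PROMPT}],
            temperature=0.8,   # higher temperature for more varied samples
            max_tokens=300,
        )
        pairs.append(resp.choices[0].message.content)
    return pairs

if __name__ == "__main__":
    for i, pair in enumerate(generate_synthetic_pairs(), start=1):
        print(f"--- sample {i} ---\n{pair}\n")
```

In the full pipeline described in the announcement, the Reward variant would then score candidate responses so that low-quality samples can be filtered out before the synthetic data is used for training.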
Healthcare • 25%
Finance • 25%
Manufacturing • 25%
Retail • 25%
Yes • 50%
No • 50%
GPT-4 • 33%
Claude-2 • 33%
LLaMA-3 • 33%
No • 50%
Yes • 50%
Nemotron-5 • 33%
Nemotron-7 • 33%
Nemotron-6 • 33%