Meta's Llama 3.3: 70B Model with 128K Context Window Matches 405B Performance
Dec 6, 2024, 05:22 PM
Meta Platforms Inc. has released Llama 3.3, a 70-billion-parameter language model that matches the performance of its earlier 405-billion-parameter Llama 3.1 405B at a significantly lower cost. The text-only model supports eight languages, has a 128,000-token context window, and is released under the Llama 3.3 Community License. The release continues Meta's push to make high-performance AI models more accessible and cost-effective for developers. Llama 3.3 outperforms Amazon's Nova Pro and matches the capabilities of Llama 3.1 405B while improving efficiency in areas such as math and reasoning. It was trained on 15 trillion tokens, has a knowledge cutoff of December 2023, and is available for download from Meta and Hugging Face.
Markets
Yes • 50% / No • 50% — Resolution source: download statistics from Hugging Face's official website or announcements
No • 50% / Yes • 50% — Resolution source: benchmark performance results published by Meta or independent AI benchmarking organizations
No • 50% / Yes • 50% — Resolution source: official announcements from Meta Platforms Inc.
OpenAI • 25% / Other • 25% / Meta • 25% / Google • 25% — Resolution source: performance benchmarks published by AI research organizations or companies
Japanese • 25% / Other • 25% / Russian • 25% / Korean • 25% — Resolution source: official updates from Meta Platforms Inc.
Other • 25% / Healthcare • 25% / Finance • 25% / Education • 25% — Resolution source: industry reports and adoption case studies published by Meta or industry analysts