Meta Unveils Llama 3.3: 70B Parameter Model Outperforms Competitors
Dec 6, 2024, 05:31 PM
Meta Platforms Inc. has released Llama 3.3, a new open-source language model with 70 billion parameters that matches the performance of the much larger Llama 3.1, which has 405 billion parameters. The model offers significant gains in efficiency and cost-effectiveness, making it well suited to text-based applications such as synthetic data generation. Llama 3.3 supports eight languages, has a 128,000-token context window, and was trained on more than 15 trillion tokens. It outperforms several competitors, including Google's Gemini 1.5 Pro, OpenAI's GPT-4o, and Amazon's Nova Pro, on various benchmarks, scoring 50.5% on GPQA Diamond (CoT) and 77.0% on Math (CoT). The model is now available to developers through Meta, Hugging Face, and Groq, with partnerships extending its reach to services such as Fireworks AI and Databricks.
Markets
- Yes • 50% / No • 50%
  Resolution source: Announcements from major consumer application companies or Meta
- No • 50% / Yes • 50%
  Resolution source: Developer adoption statistics from platforms like Hugging Face, Meta developer reports
- No • 50% / Yes • 50%
  Resolution source: Official announcements from Meta Platforms Inc.
- Other • 25% / Natural Language Processing • 25% / Synthetic Data Generation • 25% / Multilingual Applications • 25%
  Resolution source: Industry reports and usage statistics from platforms like Meta and Hugging Face
- Math (CoT) • 25% / None • 25% / GPQA Diamond (CoT) • 25% / Other AI Benchmark • 25%
  Resolution source: Benchmark results published by reputable AI performance benchmarking organizations
- No New Partners • 25% / Fireworks AI • 25% / Databricks • 25% / New Partner • 25%
  Resolution source: Official announcements from Meta Platforms Inc. and its partners