Will AI2's Molmo 72B model achieve state-of-the-art performance on OOD robotics perception tasks by Q1 2025?
Yes • 50%
No • 50%
Resolution sources: research papers, official benchmark results, or statements from AI research organizations
AI2 Launches Molmo: Open-Source AI Models in 1B, 7B, 72B Sizes, Using 1000x Less Data
Sep 25, 2024, 03:42 PM
The Allen Institute for AI (AI2) has launched Molmo, a family of open-source multimodal AI models. Available in 1B-, 7B-, and 72B-parameter sizes, the models are designed to outperform proprietary systems such as GPT-4V, Claude 3.5 Sonnet, and Gemini 1.5 Pro on vision and language tasks. Molmo is trained on PixMo, a novel dataset of high-quality image-caption pairs and multimodal instruction data, using roughly 1,000x less data than its closed-source counterparts, and supports rich interactions in both physical and virtual worlds. By releasing open weights that researchers and developers can build upon, AI2 aims to democratize access to advanced AI capabilities. Molmo also posts strong results on RealworldQA and on out-of-distribution (OOD) robotics perception tasks.
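Because the weights are open, the models can be tried directly from Hugging Face. The sketch below follows the usage pattern documented on the Molmo model cards, assuming the allenai/Molmo-7B-D-0924 checkpoint and its custom remote-code API (processor.process, model.generate_from_batch); treat it as an illustrative sketch rather than an official integration, and the image URL as a placeholder.

```python
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

# Load the Molmo processor and model; trust_remote_code is required because
# Molmo ships its own modeling code alongside the open weights.
processor = AutoProcessor.from_pretrained(
    "allenai/Molmo-7B-D-0924",
    trust_remote_code=True,
    torch_dtype="auto",
    device_map="auto",
)
model = AutoModelForCausalLM.from_pretrained(
    "allenai/Molmo-7B-D-0924",
    trust_remote_code=True,
    torch_dtype="auto",
    device_map="auto",
)

# Preprocess an image and a text prompt into model inputs.
image = Image.open(requests.get("https://picsum.photos/id/237/536/354", stream=True).raw)
inputs = processor.process(images=[image], text="Describe this image.")

# Move inputs to the model's device and add a batch dimension of 1.
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# Generate up to 200 new tokens, stopping at the end-of-text marker.
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)

# Keep only the newly generated tokens and decode them to text.
generated_tokens = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(generated_tokens, skip_special_tokens=True))
```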