What will be the ranking of AI2's Molmo models on the SuperGLUE leaderboard by end of 2024?
Top 1 • 25%
Top 5 • 25%
Top 10 • 25%
Below Top 10 • 25%
SuperGLUE leaderboard
AI2 Launches Molmo: Open-Source AI Models in 1B, 7B, 72B Sizes, Using 1000x Less Data
Sep 25, 2024, 03:42 PM
The Allen Institute for AI (AI2) has launched Molmo, a family of open-source multimodal AI models. Available in 1B-, 7B-, and 72B-parameter sizes, the models are designed to outperform proprietary systems such as GPT-4V, Claude 3.5 Sonnet, and Gemini 1.5 Pro. Molmo excels at vision and language tasks, leveraging a novel dataset called PixMo, which includes high-quality image-caption pairs and multimodal instruction data. The models support rich interactions in both physical and virtual worlds while using roughly 1000x less training data than their closed-source counterparts. By releasing open weights, AI2 aims to democratize access to advanced AI capabilities and let researchers and developers build on the models. Molmo also shows strong performance on RealworldQA and out-of-distribution (OOD) robotics perception tasks.