What will be Molmo's ranking in the 2024 AI Model Performance Leaderboard?
Ranked 1st • 25%
Ranked 2nd • 25%
Ranked 3rd to 5th • 25%
Ranked below 5th • 25%
Resolution source: AI Model Performance Leaderboard publications from recognized AI research institutions
Allen Institute for AI Releases State-of-the-Art Molmo Model with 72B Parameters, Surpassing GPT-4V
Sep 25, 2024, 01:50 PM
The Allen Institute for AI has released the Multimodal Open Language Model (Molmo), a state-of-the-art multimodal vision-language model. Molmo is available in multiple sizes, including 1B, 7B, and 72B parameters, and is designed to surpass existing models such as GPT-4V and Claude 3.5 Sonnet. The family includes four checkpoints; among them are MolmoE-1B, a mixture-of-experts model with 1B active parameters and 7B total parameters, and Molmo-7B-O, the most open 7B model. Molmo benchmarks above GPT-4V and Flash, and it achieves human-preference scores on par with top API models. Additionally, Molmo is trained on the PixMo dataset of high-quality captions. The model is supported by platforms such as hyperbolic_labs and MistralAI.