Will Molmo be used in a published research paper at a top-tier academic conference by June 30, 2025?
Yes • 50%
No • 50%
Qualifying publications: papers published at conferences such as NeurIPS, ICML, or CVPR
AI2 Releases Molmo, Open-Source AI Model in 1B, 7B, and 72B Sizes Outperforming Proprietary Systems
Sep 25, 2024, 01:50 PM
The Allen Institute for AI (AI2) has released the Multimodal Open Language Model (Molmo), a family of state-of-the-art open-source AI models. Molmo comes in 1B, 7B, and 72B-parameter sizes and has been shown to outperform proprietary models such as GPT-4V, Claude 3.5 Sonnet, and Gemini 1.5 Flash. AI2 attributes the models' performance to a focus on data quality over quantity, using PixMo, a meticulously curated dataset assembled over nine months. Molmo can understand and act on multimodal data, enabling rich interactions in both physical and virtual worlds. The release includes four model checkpoints, MolmoE-1B, Molmo-7B-O, Molmo-7B-D, and Molmo-72B, which AI2 describes as the most capable open-source models released to date. In human preference evaluations, the 72B model scores on par with top proprietary API models.
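Because the checkpoints are openly released, researchers can load them directly from Hugging Face, which is the most likely route to the kind of research use this market asks about. Below is a minimal sketch of running one checkpoint with the transformers library; the repository id allenai/Molmo-7B-D-0924 and the process/generate_from_batch helpers follow the pattern published on AI2's model cards, but they ship as custom remote code, so treat the exact names as assumptions and verify against the current card.

# Minimal sketch: load a released Molmo checkpoint via Hugging Face transformers.
# Assumptions: the repo id "allenai/Molmo-7B-D-0924" and the processor/model
# helpers (processor.process, model.generate_from_batch) exposed by AI2's
# custom remote code; check the model card before relying on these names.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

MODEL_ID = "allenai/Molmo-7B-D-0924"  # assumed checkpoint name

# trust_remote_code=True is required: Molmo ships its own modeling code
# rather than a class built into the transformers library.
processor = AutoProcessor.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, trust_remote_code=True, device_map="auto"
)

# Any RGB image works; this URL is purely illustrative.
image = Image.open(
    requests.get("https://picsum.photos/536/354", stream=True).raw
)

# Build a multimodal prompt: one image plus a text instruction.
inputs = processor.process(images=[image], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# Generate a response; the stop string follows the model-card pattern.
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)

# Decode only the newly generated tokens, skipping the prompt.
generated = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(generated, skip_special_tokens=True))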
Related market: How will Molmo's citation count compare to GPT-4V's?
Molmo has fewer citations than GPT-4V • 25%
Molmo has 0-10% more citations than GPT-4V • 25%
Molmo has 10-20% more citations than GPT-4V • 25%
Molmo has more than 20% more citations than GPT-4V • 25%