AI2 Releases Molmo, Open-Source AI Model in 1B, 7B, and 72B Sizes Outperforming Proprietary Systems
Sep 25, 2024, 01:50 PM
The Allen Institute for AI (AI2) has released the Multimodal Open Language Model (Molmo), a state-of-the-art open-source AI model. Molmo is available in 1B, 7B, and 72B-parameter sizes and has been shown to outperform proprietary models such as GPT-4V, Claude 3.5 Sonnet, and Gemini 1.5 Flash. The model's performance is attributed to a focus on data quality over quantity, using PixMo, a meticulously curated dataset assembled over nine months. Molmo can understand and act on multimodal data, enabling rich interactions in both physical and virtual worlds. The release includes four model checkpoints, among them MolmoE-1B and Molmo-7B-O, making it, according to AI2, the most capable open-source AI model to date. In human preference evaluations, the 72B model is on par with top API models.
Markets
No • 50% / Yes • 50% (resolution source: press releases or official announcements from major tech companies)
No • 50% / Yes • 50% (resolution source: published papers in conferences such as NeurIPS, ICML, or CVPR)
Yes • 50% / No • 50% (resolution source: results from a publicly available and reputable AI benchmark test)
72B • 25% / Other • 25% / 1B • 25% / 7B • 25% (resolution source: download statistics from the official AI2 repository or other official download sources)
GPT-4V • 25% / Other • 25% / Flash • 25% / Claude 3.5 Sonnet • 25% (resolution source: results from publicly available AI benchmark tests)
Other • 25% / Healthcare • 25% / Finance • 25% / Retail • 25% (resolution source: press releases or official announcements from companies in various sectors)