How many times will AI2's Molmo models be downloaded on Hugging Face by Q1 2025?
Less than 10,000 • 25%
10,000 to 50,000 • 25%
50,001 to 100,000 • 25%
More than 100,000 • 25%
Resolution source: Hugging Face download statistics
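For reference, the Hub exposes per-repository download counts programmatically. Below is a minimal sketch, using the huggingface_hub client, of how the statistics behind this question could be pulled; the "allenai" organization and the "Molmo" search filter are assumptions about where the checkpoints live, and the Hub's downloads field is a rolling counter rather than an all-time total.

```python
# Minimal sketch: query Hugging Face download statistics for Molmo repos.
# Assumes the checkpoints are published under the "allenai" organization;
# the "Molmo" search filter is illustrative, not an official listing.
from huggingface_hub import list_models

total_downloads = 0
for model in list_models(author="allenai", search="Molmo"):
    downloads = model.downloads or 0
    print(f"{model.id}: {downloads:,} downloads")
    total_downloads += downloads

# Note: the Hub's `downloads` field is a rolling ~30-day counter, so
# resolving a cumulative "by Q1 2025" threshold would require summing
# periodic snapshots rather than reading a single value.
print(f"Combined recent downloads: {total_downloads:,}")
```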
AI2 Launches Molmo: Open-Source AI Models in 1B, 7B, 72B Sizes, Using 1000x Less Data
Sep 25, 2024, 03:42 PM
The Allen Institute for AI (AI2) has launched Molmo, a family of open-source multimodal AI models. Available in 1B, 7B, and 72B-parameter sizes, the models are designed to outperform proprietary systems such as GPT-4V, Claude 3.5 Sonnet, and Gemini 1.5 Pro on vision and language tasks. Molmo is trained on PixMo, a novel dataset of high-quality image-caption pairs and multimodal instruction data, using roughly 1,000x less data than its closed-source counterparts. The models support rich interactions in both physical and virtual worlds, and AI2 aims to democratize access to advanced AI capabilities by releasing open weights that researchers and developers can build upon. Molmo also posts strong results on RealWorldQA and out-of-distribution (OOD) robotics perception tasks.
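Because the weights are open, the models can be pulled directly from the Hub. The following is a hedged sketch of loading a Molmo checkpoint for inference with the transformers library, following the pattern AI2's model cards describe; the allenai/Molmo-7B-D-0924 repo id, the example image URL, and the processor.process / generate_from_batch helpers (exposed via trust_remote_code) are taken from that pattern as assumptions and may differ across releases.

```python
# Sketch of loading a Molmo checkpoint for image-and-text inference.
# Repo id and helper methods follow the pattern on AI2's model cards;
# they are assumptions here, not guaranteed to match every release.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

repo_id = "allenai/Molmo-7B-D-0924"  # assumed repo id
processor = AutoProcessor.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Prepare one image plus a text prompt (placeholder image URL).
image = Image.open(requests.get("https://picsum.photos/536/354", stream=True).raw)
inputs = processor.process(images=[image], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# Molmo's remote code exposes a batch-generation helper.
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)
generated = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(generated, skip_special_tokens=True))
```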