Which platform will have the highest number of downloads for Pixtral 12B by December 31, 2024?
GitHub • 33%
Hugging Face • 33%
Torrent • 33%
Other • 33%
Resolution source: download statistics from the respective platforms (GitHub, Hugging Face, Torrent)
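For the Hugging Face portion of that resolution, the download counter can be read programmatically. The sketch below uses the huggingface_hub client; the repo id "mistralai/Pixtral-12B-2409" is an assumption about where the weights are published, and the counter reflects a recent rolling window rather than an all-time total, so cross-platform comparisons would need a consistent measurement window.

```python
# Minimal sketch of one resolution check: read the Hugging Face download
# counter for Pixtral 12B via the huggingface_hub client.
# The repo id "mistralai/Pixtral-12B-2409" is an assumption; adjust it to
# wherever the weights are actually published.
from huggingface_hub import HfApi

api = HfApi()
info = api.model_info("mistralai/Pixtral-12B-2409")

# ModelInfo.downloads reports a recent rolling count, not an all-time total,
# so any comparison against GitHub or torrent figures needs a matching window.
print(f"Hugging Face downloads: {info.downloads}")
```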
Mistral AI Releases 25.38 GB Pixtral 12B, Its First 12-Billion Parameter Multimodal Model
Sep 11, 2024, 08:15 AM
Mistral AI has released its first multimodal model, Pixtral 12B, which integrates language and vision processing in a single model. The model, approximately 25.38 GB in size, uses a 12-billion-parameter architecture with 40 layers and a hidden dimension of 14,336. Key specifications include a text backbone based on Mistral Nemo 12B, a vision adapter with 400 million parameters, and an enlarged vocabulary of 131,072 tokens. The vision encoder uses GeLU activations and 2D RoPE (rotary position embeddings), and the model introduces three new special tokens. Pixtral 12B is available via torrent and has been uploaded to platforms including GitHub and Hugging Face. This release marks a significant advancement in multimodal AI technology.
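As a quick sanity check, the reported file size is consistent with the stated parameter counts if the weights are stored in bfloat16. The sketch below assumes a roughly 12.3-billion-parameter text backbone (the story only says "12-billion parameter") plus the 400-million-parameter vision adapter quoted above, at two bytes per weight.

```python
# Back-of-the-envelope check that the quoted specs are self-consistent:
# ~12B text-backbone parameters plus the 400M-parameter vision adapter,
# stored as bfloat16 (2 bytes per parameter), lands near the reported
# 25.38 GB download size. The 12.3e9 backbone count is an assumption;
# the 0.4e9 adapter count is taken from the story.
backbone_params = 12.3e9   # assumed text backbone (Mistral Nemo 12B class)
adapter_params = 0.4e9     # vision adapter, as stated in the story
bytes_per_param = 2        # bfloat16 weights

size_gb = (backbone_params + adapter_params) * bytes_per_param / 1e9
print(f"Estimated checkpoint size: {size_gb:.1f} GB")  # -> 25.4 GB
```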