Meta Introduces 34B-Parameter Chameleon Mixed-Modal Models
May 17, 2024, 01:20 AM
Meta has introduced Chameleon, a family of early-fusion, token-based mixed-modal models capable of understanding and generating images and text in any arbitrary sequence. The research, conducted by FAIR last year, presents a stable training recipe for these fully token-based, auto-regressive multi-modal models. The largest model in the family, a 34B-parameter autoregressive model, was trained on approximately 10 trillion tokens. The models are designed for interleaved text and image understanding and generation, demonstrating state-of-the-art performance and versatile capabilities.
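The "early-fusion, token-based" design described above means image content is quantized into discrete tokens that share a single vocabulary and a single autoregressive sequence with text tokens. The following is a minimal illustrative sketch of that interleaving idea; the vocabulary sizes, function names, and token ids are hypothetical, not Meta's actual tokenizer or code.

```python
# Hypothetical early-fusion tokenization sketch: image patches are quantized
# into discrete codebook ids, offset into a shared vocabulary with text, and
# interleaved into one sequence for a single autoregressive transformer.

TEXT_VOCAB_SIZE = 65536      # assumed text vocabulary size
IMAGE_CODEBOOK_SIZE = 8192   # assumed VQ codebook size for image tokens

def image_to_tokens(patch_codes):
    """Offset image codebook ids so they share one vocabulary with text."""
    return [TEXT_VOCAB_SIZE + c for c in patch_codes]

def build_sequence(segments):
    """Flatten interleaved (kind, payload) segments into one token stream."""
    seq = []
    for kind, payload in segments:
        if kind == "text":
            seq.extend(payload)                # already text token ids
        elif kind == "image":
            seq.extend(image_to_tokens(payload))
        else:
            raise ValueError(f"unknown segment kind: {kind}")
    return seq

# A caption, then an image, then more text: one unified token sequence
# a single model can process and generate left to right.
seq = build_sequence([
    ("text", [11, 42, 7]),
    ("image", [0, 5, 8191]),
    ("text", [99]),
])
print(seq)  # [11, 42, 7, 65536, 65541, 73727, 99]
```

Because both modalities live in one token stream, generation can freely switch between emitting text tokens and image tokens, which is what enables the "any arbitrary sequence" capability the announcement describes.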