Meta Introduces 34B Parameter Chameleon Mixed-Modal Models
May 17, 2024, 01:20 AM
Meta has introduced Chameleon, a family of early-fusion, token-based mixed-modal models that can understand and generate images and text in any arbitrary sequence. The research, conducted at FAIR last year, presents a stable training recipe for fully token-based, early-fusion autoregressive multi-modal models. The 34B-parameter Chameleon model was trained on approximately 10 trillion tokens. The models achieve state-of-the-art performance on interleaved text-and-image understanding and generation, demonstrating broadly versatile capabilities.
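The core idea of early fusion is that both modalities are mapped into one shared discrete vocabulary, interleaved into a single token sequence, and modeled end-to-end by a single autoregressive transformer. The sketch below illustrates that idea only; it is not Meta's implementation, and the tokenizer, vocabulary sizes, and model dimensions are illustrative assumptions rather than Chameleon's actual configuration.

```python
import torch
import torch.nn as nn

TEXT_VOCAB = 32_000           # assumed text vocabulary size
IMAGE_VOCAB = 8_192           # assumed discrete image-code vocabulary (e.g. from a VQ image tokenizer)
VOCAB = TEXT_VOCAB + IMAGE_VOCAB  # one shared vocabulary: image codes are offset past the text ids
D_MODEL, N_HEADS, N_LAYERS, MAX_LEN = 512, 8, 4, 1024


class EarlyFusionLM(nn.Module):
    """A single autoregressive transformer over the fused text+image token stream."""

    def __init__(self):
        super().__init__()
        self.tok = nn.Embedding(VOCAB, D_MODEL)
        self.pos = nn.Embedding(MAX_LEN, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, N_HEADS, 4 * D_MODEL, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, N_LAYERS)
        self.head = nn.Linear(D_MODEL, VOCAB)

    def forward(self, ids):                      # ids: (batch, seq)
        seq = ids.shape[1]
        x = self.tok(ids) + self.pos(torch.arange(seq, device=ids.device))
        mask = nn.Transformer.generate_square_subsequent_mask(seq).to(ids.device)
        x = self.blocks(x, mask=mask)            # causal mask => next-token prediction
        return self.head(x)                      # logits over the shared text+image vocabulary


# Interleave text tokens and image codes into one sequence, in any order.
text_ids = torch.randint(0, TEXT_VOCAB, (1, 16))                  # stand-in text tokens
image_ids = torch.randint(0, IMAGE_VOCAB, (1, 32)) + TEXT_VOCAB   # stand-in image codes, offset into the shared vocab
fused = torch.cat([text_ids, image_ids, text_ids], dim=1)         # text -> image -> text
logits = EarlyFusionLM()(fused)
print(logits.shape)  # (1, 64, VOCAB): the same head predicts either text or image tokens at every step
```

Because every position is just a token id in the shared vocabulary, the same next-token objective covers text continuation, image generation, and any interleaving of the two.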