Will LiquidAI announce a partnership with a major tech company for edge deployment by March 2025?
Yes • 50%
No • 50%
Resolution source: Official press release from LiquidAI or the partnering tech company
MIT Spinoff LiquidAI Debuts State-of-the-Art Non-Transformer AI Models with Context Length Scaling
Sep 30, 2024, 05:14 PM
MIT spinoff LiquidAI has introduced Liquid Foundation Models (LFMs), a new series of language models that represent a significant advancement in AI technology. These models, released in 1B, 3B, and 40B parameter sizes, are notable for achieving state-of-the-art performance without being based on the traditional Transformer architecture. The LFMs are designed to have minimal memory footprints, making them suitable for edge deployments. The research team, which includes notable figures such as Joscha Bach and Mikhail Parakhin, has reimagined AI from first principles, producing models that outperform traditional architectures. LiquidAI's approach draws inspiration from biological systems, aiming to rethink every part of the AI pipeline, from architecture design to post-training. The models also feature context length scaling and achieve state-of-the-art performance without a GPT-style architecture.
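The "minimal memory footprint" and "context length scaling" claims can be made concrete with a rough back-of-the-envelope sketch. The numbers and model dimensions below are illustrative assumptions, not LiquidAI's actual figures: a Transformer's KV cache grows linearly with context length, while a recurrent or state-space style model (the family non-Transformer LFMs are generally understood to belong to) carries a fixed-size state regardless of context.

```python
# Illustrative memory-footprint comparison (assumed dimensions, not
# LiquidAI's real architecture): Transformer KV cache vs. fixed recurrent state.

def kv_cache_bytes(seq_len, n_layers=32, n_heads=32, head_dim=128, dtype_bytes=2):
    """Approximate Transformer KV-cache size: two tensors (K and V) per
    layer, each of shape seq_len x n_heads x head_dim, in fp16."""
    return 2 * n_layers * seq_len * n_heads * head_dim * dtype_bytes

def recurrent_state_bytes(state_dim=4096, n_layers=32, dtype_bytes=2):
    """A recurrent/state-space model keeps one fixed-size state vector per
    layer, independent of how many tokens have been processed."""
    return n_layers * state_dim * dtype_bytes

for seq_len in (1_000, 32_000, 1_000_000):
    kv = kv_cache_bytes(seq_len) / 2**20       # MiB
    state = recurrent_state_bytes() / 2**20    # MiB
    print(f"{seq_len:>9} tokens: KV cache ~{kv:,.0f} MiB, fixed state ~{state:.3f} MiB")
```

Under these assumed dimensions, the KV cache scales from roughly half a gigabyte at 1K tokens to hundreds of gigabytes at 1M tokens, while the recurrent state stays constant at well under a megabyte, which is why constant-state architectures are attractive for edge deployment.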