MIT Spinoff LiquidAI Debuts State-of-the-Art Non-Transformer AI Models with Context Length Scaling
Sep 30, 2024, 05:14 PM
MIT spinoff LiquidAI has introduced Liquid Foundation Models (LFMs), a new series of language models released in 1B, 3B, and 40B parameter versions. The models are notable for achieving state-of-the-art performance without relying on the traditional GPT-style Transformer architecture. LFMs are designed with minimal memory footprints, making them suitable for edge deployments, and they feature context length scaling. The research team, which includes notable figures such as Joscha Bach and Mikhail Parakhin, draws inspiration from biological systems and has rethought the AI pipeline from first principles, from architecture design through post-training, producing models that the company says outperform traditional architectures.
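LiquidAI has not published the internals of the LFM architecture, so the following is only an illustrative sketch rather than the actual LFM design: a minimal liquid time-constant (LTC) style recurrent cell in NumPy, loosely based on the team's earlier liquid neural network research. It illustrates why a state-based, non-attention model can keep its memory footprint constant with respect to context length; all class names, parameter names, and equations here are assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch of a liquid time-constant (LTC) style recurrent cell.
# This is NOT the published LFM architecture; it only illustrates the
# constant-memory property of state-based (non-Transformer) sequence models.

class LTCCell:
    def __init__(self, input_size, hidden_size, dt=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (hidden_size, input_size))
        self.W_rec = rng.normal(0.0, 0.1, (hidden_size, hidden_size))
        self.bias = np.zeros(hidden_size)
        self.tau = np.ones(hidden_size)   # base time constants per unit
        self.A = np.ones(hidden_size)     # equilibrium targets per unit
        self.dt = dt                      # Euler integration step size

    def step(self, x, h):
        # Input-dependent gate modulates each unit's effective time constant.
        f = np.tanh(self.W_in @ x + self.W_rec @ h + self.bias)
        # One explicit-Euler step of: dh/dt = -(1/tau + f) * h + f * A
        dh = -(1.0 / self.tau + f) * h + f * self.A
        return h + self.dt * dh

    def run(self, inputs):
        # Memory use is O(hidden_size), independent of sequence length --
        # unlike Transformer attention, whose key/value cache grows with
        # context, which is why such models suit edge deployment.
        h = np.zeros(self.W_rec.shape[0])
        for x in inputs:
            h = self.step(x, h)
        return h

if __name__ == "__main__":
    cell = LTCCell(input_size=8, hidden_size=16)
    seq = np.random.default_rng(1).normal(size=(100, 8))
    print(cell.run(seq).shape)  # (16,) regardless of sequence length
```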
Markets
• Yes • 50% / No • 50% (resolution source: NeurIPS 2024 conference schedule or official presentation videos)
• Yes • 50% / No • 50% (resolution source: official press release from LiquidAI or the partnering tech company)
• Yes • 50% / No • 50% (resolution source: official SuperGLUE benchmark results published on the SuperGLUE leaderboard)
• NeurIPS • 25% / ICML • 25% / CVPR • 25% / Other • 25% (resolution source: official conference schedules or presentation videos)
• 1B parameter model • 25% / 3B parameter model • 25% / 40B parameter model • 25% / None of the models achieve the highest score • 25% (resolution source: official SuperGLUE benchmark results published on the SuperGLUE leaderboard)
• Google • 25% / Microsoft • 25% / Amazon • 25% / Other • 25% (resolution source: official press release from LiquidAI or the partnering tech company)