LiquidAI Introduces SOTA Liquid Foundation Models: 1B, 3B, 40B
Sep 30, 2024, 04:20 PM
LiquidAI has introduced a new series of Liquid Foundation Models (LFMs) in 1B, 3B, and 40B parameter sizes. The models are built from first principles on a custom architecture that departs from the traditional Transformer design. They target state-of-the-art (SOTA) performance with a minimal memory footprint and efficient inference, making them suitable for edge deployments. As general-purpose sequence models, they handle both text and audio tasks. Key figures involved in this development include Joscha Bach and Mikhail Parakhin. On benchmarks such as MMLU, ARC, and GSM8K, the LFMs have shown better performance than traditional models in the same parameter range.
Markets

Market 1
- Yes • 50%
- No • 50%
Resolution source: Official announcements from LiquidAI or the partnering company

Market 2
- No • 50%
- Yes • 50%
Resolution source: Publicly available ARC benchmark results

Market 3
- No • 50%
- Yes • 50%
Resolution source: Publicly available MMLU benchmark results

Market 4
- MMLU • 25%
- None by June 30, 2024 • 25%
- GSM8K • 25%
- ARC • 25%
Resolution source: Publicly available benchmark results

Market 5
- 3B model • 25%
- None by end of 2024 • 25%
- 1B model • 25%
- 40B model • 25%
Resolution source: Official announcements from the Fortune 500 company or LiquidAI

Market 6
- Other • 25%
- TechCrunch • 25%
- Wired • 25%
- MIT Technology Review • 25%
Resolution source: Publication of the review in the major publication
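The percentages attached to each market option can be read as implied probabilities: a uniform 50/50 split on the binary markets and 25% per option on the four-way markets. A minimal sketch of how such prices normalize to probabilities (the `implied_probabilities` helper is hypothetical, not part of any DeepNewz API; the numbers simply mirror the uniform priors shown above):

```python
def implied_probabilities(prices: dict) -> dict:
    """Normalize option prices (given in percent) so they sum to 1.0."""
    total = sum(prices.values())
    return {option: price / total for option, price in prices.items()}


# Binary market, mirroring the Yes/No • 50% options above.
binary_market = {"Yes": 50.0, "No": 50.0}

# Four-option market, mirroring the MMLU/GSM8K/ARC • 25% options above.
multi_market = {
    "MMLU": 25.0,
    "None by June 30, 2024": 25.0,
    "GSM8K": 25.0,
    "ARC": 25.0,
}

print(implied_probabilities(binary_market))  # each option maps to 0.5
print(implied_probabilities(multi_market))   # each option maps to 0.25
```

Normalizing by the total (rather than assuming prices already sum to 100) keeps the result a valid probability distribution even when listed prices drift slightly away from a full 100%.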