Which major publication will first feature a detailed review of LiquidAI's LFMs by March 31, 2025?
TechCrunch • 25%
Wired • 25%
MIT Technology Review • 25%
Other • 25%
Resolution: a detailed review of the LFMs is published by the selected major publication.
LiquidAI Introduces SOTA Liquid Foundation Models: 1B, 3B, 40B
Sep 30, 2024, 04:20 PM
LiquidAI has introduced a new series of Liquid Foundation Models (LFMs) at 1B, 3B, and 40B parameters. The models are built from first principles on a custom architecture that does not use the traditional Transformer design. They are designed for state-of-the-art (SOTA) performance with a minimal memory footprint and efficient inference, making them suitable for edge deployments, and they are general-purpose sequence models capable of handling text and audio tasks. Key figures involved in this development include Joscha Bach and Mikhail Parakhin. On benchmarks such as MMLU, ARC, and GSM8K, the LFMs have outperformed traditional models in the same parameter range.
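As a rough illustration of how the benchmark comparison above could be reproduced, the sketch below uses EleutherAI's lm-evaluation-harness to score a model on MMLU, ARC, and GSM8K. The model identifier is a placeholder: the LFM weights were not publicly released as a Hugging Face checkpoint at the time of the announcement, so this is a hedged sketch of the general evaluation workflow, not LiquidAI's own setup.

# Minimal sketch, assuming lm-evaluation-harness (pip install lm-eval) is installed.
# "liquidai/lfm-3b" is a hypothetical placeholder model id, not a real public checkpoint.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                               # Hugging Face backend
    model_args="pretrained=liquidai/lfm-3b",  # placeholder identifier (assumption)
    tasks=["mmlu", "arc_challenge", "gsm8k"], # benchmarks cited in the story
    batch_size=8,
)

# Print the per-task metric dictionaries (accuracy, exact match, etc.)
for task, metrics in results["results"].items():
    print(task, metrics)

A comparison like the one described in the story would then repeat this run for baseline Transformer models in the same parameter range and compare the per-task scores.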