Sepp Hochreiter Unveils xLSTM with New Cells, Competing with Transformers
May 8, 2024, 06:15 AM
AI pioneer Sepp Hochreiter has introduced a new machine learning architecture, the Extended Long Short-Term Memory (xLSTM), which aims to address the limitations of traditional LSTMs through two new memory cells: sLSTM, a scalar-memory cell with exponential gating, and mLSTM, a matrix-memory cell with a fully parallelizable design. Together with modified memory structures, these changes improve performance and scalability, positioning xLSTM as a strong competitor to state-of-the-art Transformers and State Space Models. Hochreiter's team at NXAI is also working on building European LLMs using this new architecture.
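The exponential gating mentioned above can be illustrated with a single sLSTM recurrence step. The sketch below follows the published xLSTM equations (exponential input/forget gates, a normalizer state, and a log-space stabilizer that prevents the exponentials from overflowing); the function name, parameter layout, and shapes are illustrative assumptions, not NXAI's implementation.

```python
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One sLSTM step with exponential gating (illustrative sketch).

    W, R stack the weights for cell input z and gates i, f, o;
    n is the normalizer state, m the log-space stabilizer state.
    """
    # Pre-activations for cell input and the three gates.
    z_t, i_t, f_t, o_t = np.split(W @ x + R @ h_prev + b, 4)
    z = np.tanh(z_t)                      # cell input
    o = 1.0 / (1.0 + np.exp(-o_t))        # output gate (sigmoid)
    # Exponential input/forget gates, stabilized via m to avoid overflow.
    m = np.maximum(f_t + m_prev, i_t)     # new stabilizer state
    i = np.exp(i_t - m)                   # stabilized exponential input gate
    f = np.exp(f_t + m_prev - m)          # stabilized exponential forget gate
    c = f * c_prev + i * z                # cell state
    n = f * n_prev + i                    # normalizer state
    h = o * (c / n)                       # normalized hidden state
    return h, c, n, m
```

The stabilizer `m` shifts both gates into a numerically safe range; because the same shift divides out of `c / n`, the hidden state is unchanged by it.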
Markets
- Yes 50% • No 50% (resolution source: public announcements or earnings calls from major tech companies)
- No 50% • Yes 50% (resolution source: press releases or official announcements from NXAI)
- No 50% • Yes 50% (resolution source: results published in a peer-reviewed AI research journal or on a benchmarking platform like Papers with Code)
- Finance 25% • Telecommunications 25% • Automotive 25% • Healthcare 25% (resolution source: industry reports and news articles on xLSTM implementations)
- 30% or more Improvement 34% • 10% Improvement 33% • 20% Improvement 33% (resolution source: research papers and benchmark results)
- Predictive Analytics 34% • Natural Language Processing 33% • Computer Vision 33% (resolution source: comparative studies published in AI research journals)