Will xLSTM outperform Transformers in benchmark tests by end of 2024?
Yes • 50%
No • 50%
Results published in a peer-reviewed AI research journal or on a benchmarking platform like Papers with Code
Sepp Hochreiter Unveils xLSTM with New Cells, Competing with Transformers
May 8, 2024, 06:15 AM
AI pioneer Sepp Hochreiter has introduced a new machine learning architecture, Extended Long Short-Term Memory (xLSTM), which addresses the limitations of traditional LSTMs through two new memory cells: the scalar-memory sLSTM and the matrix-memory mLSTM. The architecture combines exponential gating with modified memory structures, and the mLSTM cell drops the hidden-state memory mixing, making it fully parallelizable and improving scalability. These changes position xLSTM as a strong competitor to state-of-the-art Transformers and State Space Models. Hochreiter's team at NXAI is also working on building European LLMs with the new architecture.
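To make the recurrence concrete, here is a minimal NumPy sketch of a single sLSTM-style step, assuming stacked gate weights W, R and bias b (the parameter names and layout are illustrative assumptions, not NXAI's implementation). Exponential input and forget gates are kept numerically stable with a log-domain stabilizer state m, and a normalizer state n rescales the cell state before the output gate is applied.

```python
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One sLSTM-style step with exponential gating (illustrative sketch)."""
    # Stacked pre-activations for the input, forget, cell-input, and output gates.
    i_tilde, f_tilde, z_tilde, o_tilde = np.split(W @ x + R @ h_prev + b, 4)

    # Log-domain stabilizer: keeps the arguments of exp() bounded.
    m = np.maximum(f_tilde + m_prev, i_tilde)
    i = np.exp(i_tilde - m)             # exponential input gate (stabilized)
    f = np.exp(f_tilde + m_prev - m)    # exponential forget gate (stabilized)

    z = np.tanh(z_tilde)                # cell input
    o = 1.0 / (1.0 + np.exp(-o_tilde))  # sigmoid output gate

    c = f * c_prev + i * z              # cell state with exponential gating
    n = f * n_prev + i                  # normalizer state
    h = o * (c / n)                     # normalized hidden state
    return h, c, n, m
```

The mLSTM cell, by contrast, replaces the scalar cell state with an outer-product matrix memory updated as C_t = f_t C_{t-1} + i_t v_t k_t^T; because that update needs no hidden-state memory mixing across time steps, it can be computed in parallel, which underlies the scalability claim above.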
Speed • 33%
Scalability • 33%
Accuracy • 34%

Customer service chatbots • 25%
Automated content generation • 25%
Real-time translation services • 25%
Data analysis and insights • 25%

North America • 25%
Europe • 25%
Asia • 25%
Rest of the World • 25%

Widespread adoption • 33%
Moderate adoption • 33%
Limited adoption • 34%

Finance • 25%
Healthcare • 25%
Automotive • 25%
Telecommunications • 25%

10% Improvement • 33%
20% Improvement • 33%
30% or more Improvement • 34%