Performance improvement of xLSTM over standard LSTM by 2024
10% Improvement • 33%
20% Improvement • 33%
30% or more Improvement • 34%
Resolution source: Research papers and benchmark results
Sepp Hochreiter Unveils xLSTM with New Cells, Competing with Transformers
May 8, 2024, 06:15 AM
AI pioneer Sepp Hochreiter has introduced a new machine learning architecture, Extended Long Short-Term Memory (xLSTM), which addresses the limitations of traditional LSTMs with two new memory cells: sLSTM, a scalar-memory cell with revised memory mixing, and mLSTM, a fully parallelizable cell that replaces the scalar cell state with a matrix memory. Both cells use exponential gating in place of the classic sigmoid gates, improving the network's ability to revise its storage decisions and to scale. These changes position xLSTM as a direct competitor to state-of-the-art Transformers and State Space Models. Hochreiter's team at NXAI is also building European LLMs on the new architecture.
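The two mechanisms named above can be made concrete with a short sketch. The NumPy snippet below shows an sLSTM-style step whose exponential input and forget gates are kept numerically stable by a running stabilizer state, and an mLSTM-style step that swaps the scalar cell for a matrix memory updated with a covariance rule. It follows the update equations reported for xLSTM, but every function name, weight shape, and parameter layout here is an illustrative assumption, not NXAI's implementation.

```python
# Minimal sketch of the two xLSTM memory cells described above.
# All names, shapes, and the weight packing are illustrative assumptions.
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W):
    """One sLSTM-style step: scalar cell c, normalizer n, stabilizer m."""
    z = np.tanh(W["z"] @ x + W["rz"] @ h_prev)                  # cell input
    i_tilde = W["i"] @ x + W["ri"] @ h_prev                     # input gate, log space
    f_tilde = W["f"] @ x + W["rf"] @ h_prev                     # forget gate, log space
    o = 1.0 / (1.0 + np.exp(-(W["o"] @ x + W["ro"] @ h_prev)))  # output gate

    # Exponential gating, stabilized so the exp() arguments stay bounded.
    m = np.maximum(f_tilde + m_prev, i_tilde)
    i = np.exp(i_tilde - m)
    f = np.exp(f_tilde + m_prev - m)

    c = f * c_prev + i * z   # cell state
    n = f * n_prev + i       # normalizer state
    h = o * (c / n)          # normalized hidden state
    return h, c, n, m

def mlstm_step(q, k, v, C_prev, n_prev, i_gate, f_gate):
    """One mLSTM-style step: matrix memory C updated with a covariance rule.
    i_gate and f_gate are scalar gates assumed to be stabilized already;
    the paper's version is parallelizable across time, shown sequentially
    here for clarity."""
    C = f_gate * C_prev + i_gate * np.outer(v, k)  # matrix memory update
    n = f_gate * n_prev + i_gate * k               # normalizer vector
    h = (C @ q) / max(abs(n @ q), 1.0)             # normalized retrieval
    return h, C, n

# Toy usage: run a short random sequence through the sLSTM-style cell.
d = 8
rng = np.random.default_rng(0)
W = {name: rng.normal(scale=0.1, size=(d, d))
     for name in ["z", "rz", "i", "ri", "f", "rf", "o", "ro"]}
h = c = n = m = np.zeros(d)
for x in rng.normal(size=(5, d)):
    h, c, n, m = slstm_step(x, h, c, n, m, W)
```

The normalizer states are the design choice to note: because the exponential gates are unbounded, both cells divide the retrieved state by an accumulated normalizer, which keeps the hidden state in a usable range.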
Options for related questions on this story:

Speed • 33%
Scalability • 33%
Accuracy • 34%

North America • 25%
Europe • 25%
Asia • 25%
Rest of the World • 25%

Widespread adoption • 33%
Moderate adoption • 33%
Limited adoption • 34%

Customer service chatbots • 25%
Automated content generation • 25%
Real-time translation services • 25%
Data analysis and insights • 25%

Less than 20% improvement • 25%
20-40% improvement • 25%
40-60% improvement • 25%
More than 60% improvement • 25%

0-10% • 25%
11-20% • 25%
21-30% • 25%
Above 30% • 25%

Finance • 25%
Telecommunications • 25%
Automotive • 25%
Healthcare • 25%