Will xLSTM outperform leading Transformers in NLP benchmarks by mid-2025?
Yes • 50%
No • 50%
Resolution source: Published results in a peer-reviewed AI journal or major AI conference
Sepp Hochreiter's xLSTM: A New Rival to Transformers Under NXAI Initiative
May 8, 2024, 06:26 AM
AI pioneer Sepp Hochreiter has introduced a new architecture called xLSTM (Extended Long Short-Term Memory), which aims to address the limitations of traditional LSTMs and compete with state-of-the-art language models such as Transformers. xLSTM incorporates exponential gating and modified memory structures, introducing two new memory cells, sLSTM and mLSTM, the latter of which is fully parallelizable; together these changes are intended to improve performance and scalability. The development is part of a broader effort to advance European language model capabilities under the NXAI initiative. The new model has generated significant interest in the AI community, with discussion of its potential to outperform existing models; however, no code or weights have been shared yet.
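Because no official code has been released, the following is only a minimal NumPy sketch of an sLSTM-style recurrence step with exponential gating and a normalizer state, based on the update equations in the xLSTM paper; the function name, weight layout, and initialization are illustrative assumptions, not the NXAI implementation.

import numpy as np

# Illustrative sketch (the official NXAI code is unreleased) of one sLSTM-style
# recurrence step with exponential gating. All names here are hypothetical.

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """Advance one sLSTM-style cell by one time step.

    c: cell state; n: normalizer state that accumulates gate activations;
    m: log-space stabilizer that keeps the exponential gates from overflowing.
    W, R, b hold input weights, recurrent weights, and biases for the
    input/forget/output gates and the cell input 'z'.
    """
    pre = {g: W[g] @ x + R[g] @ h_prev + b[g] for g in ("i", "f", "o", "z")}

    z = np.tanh(pre["z"])                 # candidate cell input
    o = 1.0 / (1.0 + np.exp(-pre["o"]))   # output gate (sigmoid)

    # Exponential gates, stabilized in log space:
    # m_t = max(log f_t + m_{t-1}, log i_t); subtracting m_t leaves the
    # hidden state unchanged but keeps exp() in a safe numerical range.
    m = np.maximum(pre["f"] + m_prev, pre["i"])
    i = np.exp(pre["i"] - m)              # stabilized exponential input gate
    f = np.exp(pre["f"] + m_prev - m)     # stabilized exponential forget gate

    c = f * c_prev + i * z                # cell state update
    n = f * n_prev + i                    # normalizer state update
    h = o * (c / n)                       # normalized hidden state
    return h, c, n, m

# Tiny smoke test with random weights (illustrative only).
rng = np.random.default_rng(0)
d = 4
W = {g: rng.normal(scale=0.1, size=(d, d)) for g in ("i", "f", "o", "z")}
R = {g: rng.normal(scale=0.1, size=(d, d)) for g in ("i", "f", "o", "z")}
b = {g: np.zeros(d) for g in ("i", "f", "o", "z")}
h, c, n, m = np.zeros(d), np.zeros(d), np.zeros(d), np.zeros(d)
for x in rng.normal(size=(3, d)):
    h, c, n, m = slstm_step(x, h, c, n, m, W, R, b)

The mLSTM variant, per the paper, instead stores a matrix memory updated by an outer product of key and value vectors; removing the hidden-to-hidden dependency in that update is what makes the recurrence parallelizable.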
Related forecasts

Aspect of improvement:
Speed • 33%
Scalability • 33%
Accuracy • 34%

Margin of improvement:
10% Improvement • 33%
20% Improvement • 33%
30% or more Improvement • 34%

Adoption level:
Widespread adoption • 33%
Moderate adoption • 33%
Limited adoption • 34%

Application domain:
Natural Language Processing • 33%
Computer Vision • 33%
Predictive Analytics • 34%

Industry:
Finance • 25%
Healthcare • 25%
Automotive • 25%
Telecommunications • 25%
Use case:
Automated content generation • 25%
Data analysis and insights • 25%
Real-time translation services • 25%
Customer service chatbots • 25%

Region:
Asia • 25%
North America • 25%
Rest of the World • 25%
Europe • 25%