Will a major tech company adopt xLSTM for language tasks by end of 2025?
Yes • 50%
No • 50%
Public announcements or earnings calls from major tech companies
Sepp Hochreiter's xLSTM: A New Rival to Transformers Under NXAI Initiative
May 8, 2024, 06:26 AM
AI pioneer Sepp Hochreiter has introduced a new architecture called xLSTM (Extended Long Short-Term Memory), which aims to overcome the limitations of traditional LSTMs and compete with state-of-the-art language models such as Transformers. xLSTM introduces exponential gating and modified memory structures in the form of two new memory cells, sLSTM and mLSTM, to improve performance and scalability. The work is part of a broader effort to advance European language model capabilities under the NXAI initiative. The new model, which includes a parallelizable LSTM variant, has generated significant interest in the AI community, with discussion of whether it can outperform existing models. However, no code or weights have been released yet.
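To make the "exponential gating" idea concrete, the sketch below shows what a single sLSTM-style recurrent step could look like: the input and forget gates use exponentials stabilized in log space, and a normalizer state keeps the cell output bounded. This is an illustrative NumPy approximation based on the formulation described in the xLSTM paper, not the official NXAI implementation (which has not been released); the function name, shapes, and weight layout are assumptions for exposition.

```python
# Minimal sketch of one sLSTM-style step with exponential gating and a
# log-space stabilizer. Illustrative only; not the NXAI/xLSTM codebase.
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One recurrent step. W: (4*d, d_in), R: (4*d, d), b: (4*d,)."""
    pre = W @ x + R @ h_prev + b               # pre-activations for i, f, z, o
    i_t, f_t, z_t, o_t = np.split(pre, 4)

    # Exponential gates stabilized in log space: m_t tracks a running max
    # so the exp() calls cannot overflow.
    m_t = np.maximum(f_t + m_prev, i_t)
    i_gate = np.exp(i_t - m_t)
    f_gate = np.exp(f_t + m_prev - m_t)

    c_t = f_gate * c_prev + i_gate * np.tanh(z_t)     # cell state
    n_t = f_gate * n_prev + i_gate                    # normalizer state
    h_t = (1.0 / (1.0 + np.exp(-o_t))) * (c_t / n_t)  # sigmoid output gate
    return h_t, c_t, n_t, m_t

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
d_in, d = 8, 16
W = rng.normal(scale=0.1, size=(4 * d, d_in))
R = rng.normal(scale=0.1, size=(4 * d, d))
b = np.zeros(4 * d)
h = c = n = m = np.zeros(d)
for _ in range(5):
    h, c, n, m = slstm_step(rng.normal(size=d_in), h, c, n, m, W, R, b)
print(h.shape)  # (16,)
```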
Finance • 25%
Healthcare • 25%
Automotive • 25%
Telecommunications • 25%
Widespread adoption • 33%
Moderate adoption • 33%
Limited adoption • 34%
Natural Language Processing • 33%
Computer Vision • 33%
Predictive Analytics • 34%
Automated content generation • 25%
Data analysis and insights • 25%
Real-time translation services • 25%
Customer service chatbots • 25%
Asia • 25%
North America • 25%
Rest of the World • 25%
Europe • 25%