What will be the most significant improvement of xLSTM over traditional LSTM by end of 2024?
Speed • 33%
Scalability • 33%
Accuracy • 34%
Resolution source: Comparative studies published in AI research journals
Sepp Hochreiter Unveils xLSTM with Parallelizable LSTM, Advancing ML
May 8, 2024, 05:11 AM
AI pioneer Sepp Hochreiter has introduced a new machine learning architecture, Extended Long Short-Term Memory (xLSTM), which aims to address the limitations of traditional LSTM networks. xLSTM incorporates exponential gating and modified memory structures, improving performance and scalability. The architecture combines two specialized memory cells, sLSTM and mLSTM, the latter of which is fully parallelizable, positioning xLSTM as a strong competitor to state-of-the-art Transformers and State Space Models. Hochreiter, who has worked on LSTM for more than 30 years, and his team at NXAI are using the technology to develop European LLMs, marking a significant advancement in the field.
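As a rough illustration of the exponential gating mentioned above, the sketch below implements one recurrent step of an sLSTM-style cell in NumPy, loosely following the publicly described xLSTM formulation. The parameter names (W, R, b), shapes, and the stabilizer/normalizer bookkeeping shown here are illustrative assumptions, not NXAI's implementation.

```python
import numpy as np

def slstm_step(x_t, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One step of an sLSTM-style cell with exponential gating.

    Simplified sketch based on the published xLSTM description;
    names and shapes are illustrative assumptions.
    """
    # Stacked pre-activations for the input, forget, cell, and output gates.
    z = W @ x_t + R @ h_prev + b
    i_tilde, f_tilde, z_tilde, o_tilde = np.split(z, 4)

    # Stabilizer state m_t keeps the exponential gates numerically bounded.
    m_t = np.maximum(f_tilde + m_prev, i_tilde)
    i_t = np.exp(i_tilde - m_t)            # exponential input gate
    f_t = np.exp(f_tilde + m_prev - m_t)   # exponential forget gate

    c_t = f_t * c_prev + i_t * np.tanh(z_tilde)   # cell state update
    n_t = f_t * n_prev + i_t                      # normalizer state
    o_t = 1.0 / (1.0 + np.exp(-o_tilde))          # sigmoid output gate
    h_t = o_t * (c_t / n_t)                       # normalized hidden state
    return h_t, c_t, n_t, m_t

# Tiny usage example with random weights (hidden size 4, input size 3).
d, dx = 4, 3
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * d, dx))
R = rng.normal(size=(4 * d, d))
b = np.zeros(4 * d)
h, c, n, m = np.zeros(d), np.zeros(d), np.ones(d), np.zeros(d)
for x_t in rng.normal(size=(5, dx)):   # run a short input sequence
    h, c, n, m = slstm_step(x_t, h, c, n, m, W, R, b)
```

The exponential gates, paired with the normalizer and stabilizer states, are what distinguish this update from the classic sigmoid-gated LSTM cell.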
10% Improvement • 33%
20% Improvement • 33%
30% or more Improvement • 34%

Natural Language Processing • 33%
Computer Vision • 33%
Predictive Analytics • 34%

North America • 25%
Europe • 25%
Asia • 25%
Rest of the World • 25%

Customer service chatbots • 25%
Automated content generation • 25%
Real-time translation services • 25%
Data analysis and insights • 25%

Limited adoption • 34%
Moderate adoption • 33%
Widespread adoption • 33%