Will xLSTM outperform traditional LSTM in commercial applications by 2025?
Yes • 50%
No • 50%
Case studies or performance reports from companies using xLSTM
Sepp Hochreiter Unveils xLSTM with Parallelizable LSTM, Advancing ML
May 8, 2024, 05:11 AM
AI pioneer Sepp Hochreiter has introduced a new machine learning model, the Extended Long Short-Term Memory (xLSTM), which aims to address the limitations of traditional LSTM networks. xLSTM incorporates exponential gating and modified memory structures to improve performance and scalability. The architecture combines two specialized memory cells, sLSTM and the fully parallelizable mLSTM, positioning xLSTM as a strong competitor to state-of-the-art Transformers and State Space Models. Hochreiter, who has worked on LSTM for over 30 years, and his team at NXAI are developing European LLMs using this technology, marking a significant advance in the field.
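The exponential gating mentioned above can be sketched in a few lines. This is a minimal scalar sLSTM-style step based on the published description, including the stabilizer state that keeps the exponentials numerically safe; the weight shapes, variable names, and demo values are illustrative assumptions, not NXAI's implementation.

```python
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W):
    """One step of a scalar sLSTM-style cell with exponential gating.

    Sketch only: gate parameterization and the stabilizer follow the
    published equations, but weight layout here is a simplification.
    """
    xh = np.concatenate([x, h_prev])
    z_tilde = W["z"] @ xh   # cell-input pre-activation
    i_tilde = W["i"] @ xh   # input-gate pre-activation
    f_tilde = W["f"] @ xh   # forget-gate pre-activation
    o_tilde = W["o"] @ xh   # output-gate pre-activation

    # Stabilizer state m keeps exp() from overflowing.
    m = np.maximum(f_tilde + m_prev, i_tilde)
    i = np.exp(i_tilde - m)             # exponential input gate (stabilized)
    f = np.exp(f_tilde + m_prev - m)    # exponential forget gate (stabilized)
    o = 1.0 / (1.0 + np.exp(-o_tilde))  # sigmoid output gate

    c = f * c_prev + i * np.tanh(z_tilde)  # cell state
    n = f * n_prev + i                     # normalizer state
    h = o * (c / n)                        # normalized hidden state
    return h, c, n, m

# Tiny demo with random weights and inputs (illustrative sizes).
rng = np.random.default_rng(0)
d, dx = 4, 3
W = {k: rng.normal(scale=0.5, size=(d, dx + d)) for k in ("z", "i", "f", "o")}
h = c = n = m = np.zeros(d)
for _ in range(5):
    h, c, n, m = slstm_step(rng.normal(size=dx), h, c, n, m, W)
```

Because the normalizer `n` accumulates the exponential input gates, dividing `c` by `n` keeps the hidden state bounded even though the gates themselves are unbounded exponentials.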
10% Improvement • 33%
20% Improvement • 33%
30% or more Improvement • 34%

Customer service chatbots • 25%
Automated content generation • 25%
Real-time translation services • 25%
Data analysis and insights • 25%

Finance • 25%
Healthcare • 25%
Automotive • 25%
Telecommunications • 25%

North America • 25%
Europe • 25%
Asia • 25%
Rest of the World • 25%

Natural Language Processing • 33%
Computer Vision • 33%
Predictive Analytics • 34%

Google • 20%
Amazon • 20%
Facebook • 20%
Apple • 20%
Microsoft • 20%

Limited adoption • 34%
Moderate adoption • 33%
Widespread adoption • 33%

Speed • 33%
Accuracy • 34%
Scalability • 33%