Sepp Hochreiter Unveils xLSTM with Parallelizable LSTM, Advancing ML
May 8, 2024, 05:11 AM
AI pioneer Sepp Hochreiter has introduced a new machine learning architecture, the Extended Long Short-Term Memory (xLSTM), which aims to address the limitations of traditional LSTM networks. xLSTM incorporates exponential gating and modified memory structures, improving performance and scalability. The architecture combines two specialized memory cells, the sLSTM and the fully parallelizable mLSTM, positioning it as a strong competitor to state-of-the-art Transformers and State Space Models. Hochreiter, who has worked on LSTM for over 30 years, and his team at NXAI are using the technology to develop European LLMs, marking a significant advancement in the field.
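The exponential gating mentioned above replaces the usual sigmoid input gate with an exponential one, which requires a normalizer state and a log-space stabilizer to stay numerically safe. A minimal sketch of one such sLSTM-style step is below; this is an illustration based on the published xLSTM paper's description, not NXAI's implementation, and all parameter names are assumptions:

```python
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One step of an sLSTM-style cell with exponential gating
    (illustrative sketch; not the official NXAI code)."""
    # Pre-activations for cell input (z), input (i), forget (f), output (o)
    z_t, i_t, f_t, o_t = np.split(W @ x + R @ h_prev + b, 4)
    # Stabilizer state keeps the exponentials from overflowing
    m_t = np.maximum(f_t + m_prev, i_t)
    i = np.exp(i_t - m_t)            # exponential input gate
    f = np.exp(f_t + m_prev - m_t)   # forget gate, kept in log space
    c = f * c_prev + i * np.tanh(z_t)  # cell state
    n = f * n_prev + i                 # normalizer state
    # Hidden state is the normalized cell state, scaled by a sigmoid output gate
    h = (1.0 / (1.0 + np.exp(-o_t))) * (c / n)
    return h, c, n, m_t
```

Because the gates are exponentials of stabilized pre-activations, the normalizer `n` stays positive and the ratio `c / n` remains bounded even when the raw gate values would overflow a plain `exp()`.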
Markets

Market 1: No • 50% | Yes • 50%
Resolution source: Official announcements from tech companies or credible technology news sources

Market 2: No • 50% | Yes • 50%
Resolution source: Case studies or performance reports from companies using xLSTM

Market 3: Yes • 50% | No • 50%
Resolution source: Published results in a peer-reviewed AI or machine learning journal

Market 4: Limited adoption • 34% | Moderate adoption • 33% | Widespread adoption • 33%
Resolution source: Surveys or reports from European AI consortia

Market 5: Speed • 33% | Accuracy • 34% | Scalability • 33%
Resolution source: Comparative studies published in AI research journals

Market 6: Automotive • 25% | Finance • 25% | Healthcare • 25% | Telecommunications • 25%
Resolution source: Market analysis reports or major industry announcements