Top performing application domain for xLSTM by end of 2024
Natural Language Processing • 33%
Computer Vision • 33%
Predictive Analytics • 34%
Resolves via comparative studies published in AI research journals.
Sepp Hochreiter Unveils xLSTM with New Cells, Competing with Transformers
May 8, 2024, 06:15 AM
AI pioneer Sepp Hochreiter has introduced Extended Long Short-Term Memory (xLSTM), a new machine learning architecture that addresses the limitations of traditional LSTMs through two new memory cells, sLSTM and mLSTM. xLSTM features exponential gating, modified memory structures, and a parallelizable LSTM design, improving both performance and scalability. These changes position xLSTM as a serious competitor to state-of-the-art Transformers and State Space Models. Hochreiter's team at NXAI is also building European LLMs on the new architecture.
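To make the "exponential gating" idea concrete, here is a minimal sketch of a single sLSTM-style recurrent step in NumPy. It follows the published xLSTM description (exponential input/forget gates stabilized by a running max-state, plus a normalizer state), but the weight layout and variable names are illustrative assumptions, not NXAI's implementation.

```python
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One sLSTM-style step with stabilized exponential gating.

    Illustrative sketch only: W, R, b are assumed to stack the
    pre-activations for the cell input (z) and the input/forget/
    output gates (i, f, o), four blocks of equal size.
    """
    pre = W @ x + R @ h_prev + b
    z_t, i_t, f_t, o_t = np.split(pre, 4)

    z = np.tanh(z_t)                 # cell input
    o = 1.0 / (1.0 + np.exp(-o_t))   # output gate (sigmoid)

    # Exponential gates would overflow on their own, so a running
    # max-state m subtracts the largest exponent before exp().
    m = np.maximum(f_t + m_prev, i_t)
    i = np.exp(i_t - m)              # stabilized exponential input gate
    f = np.exp(f_t + m_prev - m)     # stabilized exponential forget gate

    c = f * c_prev + i * z           # cell state
    n = f * n_prev + i               # normalizer state
    h = o * (c / n)                  # normalized hidden output
    return h, c, n, m
```

With hidden size `d`, the states start as `h = c = n = zeros(d)` and `m = zeros(d)`; the normalizer `n` becomes positive after the first step, so the division is safe. The mLSTM cell replaces the scalar cell state with a matrix memory and drops the recurrent `R @ h_prev` term, which is what makes it parallelizable.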