Sepp Hochreiter's xLSTM: A New Rival to Transformers Under NXAI Initiative
May 8, 2024, 06:26 AM
AI pioneer Sepp Hochreiter has introduced a new architecture called xLSTM (Extended Long Short-Term Memory), which aims to address the limitations of traditional LSTMs and compete with state-of-the-art language models such as Transformers. xLSTM incorporates exponential gating and modified memory structures, introducing two new memory cells, sLSTM and mLSTM, to improve performance and scalability. This development is part of a broader effort to advance European language model capabilities under the NXAI initiative. The new model, which includes a parallelizable LSTM variant, has generated significant interest in the AI community, with discussion of its potential to outperform existing models. However, no code or weights have been shared yet.
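Since no official code has been released, the following is only an illustrative sketch of what exponential gating in an sLSTM-style cell could look like, based on the description above: gate pre-activations pass through `exp` instead of a sigmoid, a normalizer state tracks the accumulated gate mass, and a log-space stabilizer keeps the exponentials from overflowing. All names, shapes, and weight layouts here are assumptions, not the authors' implementation.

```python
import numpy as np

def slstm_step(x, h, c, n, m, W, R, b):
    """One step of a hypothetical sLSTM-style cell with exponential gating.
    States: c = cell state, n = normalizer state, m = log-space stabilizer."""
    # Pre-activations for cell input and the input/forget/output gates.
    z_tilde, i_tilde, f_tilde, o_tilde = np.split(W @ x + R @ h + b, 4)
    # Stabilizer: shift exponents so exp() never overflows.
    m_new = np.maximum(f_tilde + m, i_tilde)
    i = np.exp(i_tilde - m_new)          # exponential input gate
    f = np.exp(f_tilde + m - m_new)      # exponential forget gate
    o = 1.0 / (1.0 + np.exp(-o_tilde))   # sigmoid output gate
    z = np.tanh(z_tilde)                 # candidate cell input
    c_new = f * c + i * z                # cell state update
    n_new = f * n + i                    # normalizer accumulates gate mass
    h_new = o * (c_new / n_new)          # normalized hidden state
    return h_new, c_new, n_new, m_new

# Usage with random weights; hidden size 4, input size 3 (arbitrary choices).
rng = np.random.default_rng(0)
d, dx = 4, 3
W = rng.normal(size=(4 * d, dx))
R = rng.normal(size=(4 * d, d))
b = np.zeros(4 * d)
h = c = n = m = np.zeros(d)
for x in rng.normal(size=(5, dx)):
    h, c, n, m = slstm_step(x, h, c, n, m, W, R, b)
print(np.all(np.isfinite(h)))  # prints True: stable despite exp gates
```

Dividing by the normalizer state bounds the hidden activations even though the gates themselves are unbounded, which is one plausible way an exponential-gated recurrence can stay numerically stable.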
Markets

- No (50%) / Yes (50%) — resolution source: public announcements or earnings calls from major tech companies
- Yes (50%) / No (50%) — resolution source: official announcements from Sepp Hochreiter or the NXAI initiative
- No (50%) / Yes (50%) — resolution source: published results in a peer-reviewed AI journal or major AI conference
- Automated content generation (25%) / Data analysis and insights (25%) / Real-time translation services (25%) / Customer service chatbots (25%) — resolution source: industry reports, tech company announcements
- Asia (25%) / North America (25%) / Rest of the World (25%) / Europe (25%) — resolution source: academic publications and tech industry reports
- Apple (20%) / Google (20%) / Amazon (20%) / Facebook (20%) / Microsoft (20%) — resolution source: public announcements or press releases from tech companies