First competitor to launch Infini-attention-like technology by 2024
OpenAI • 20%
Microsoft • 20%
Amazon • 20%
Facebook • 20%
IBM • 20%
Resolution source: public announcements or credible tech news
Google Unveils 'Infini-attention' for Unlimited Context in LLMs
Apr 20, 2024, 03:52 PM
Google has introduced a machine learning technique called 'Infini-attention' that enables language models to handle unbounded context while maintaining a fixed-size memory footprint. The advance, detailed in its recent papers 'Mixture of Depths' and 'Efficient Infinite Context Transformers with Infini-attention', extends the processing capabilities of pre-trained large language models (LLMs) by applying local attention to input chunks and sequentially retaining relevant information across chunks. The development is seen as a significant step towards context lengths long enough for most industry use cases.
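For illustration, below is a minimal single-head sketch of the scheme described above: standard softmax attention applied locally within each input chunk, combined with a fixed-size running memory carried across chunks. The function names, the ELU+1 feature map, the fixed blending gate, and all dimensions are assumptions made for this toy example, not Google's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def elu_plus_one(x):
    # Non-negative feature map (assumed here) for the running memory.
    return np.where(x > 0, x + 1.0, np.exp(x))

def infini_attention_sketch(x, Wq, Wk, Wv, n_chunks, gate=0.5):
    """Toy single-head sketch: local attention per chunk plus a
    fixed-size memory that accumulates key/value summaries."""
    d = Wk.shape[1]
    memory = np.zeros((d, Wv.shape[1]))   # fixed-size memory matrix
    norm = np.zeros((d, 1))               # normalization term
    outputs = []
    for chunk in np.array_split(x, n_chunks):
        q, k, v = chunk @ Wq, chunk @ Wk, chunk @ Wv
        # 1) local softmax attention restricted to this chunk
        local = softmax(q @ k.T / np.sqrt(d)) @ v
        # 2) retrieve from the running memory built from earlier chunks
        sq = elu_plus_one(q)
        mem_out = (sq @ memory) / (sq @ norm + 1e-6)
        # 3) blend local and memory read-outs (fixed gate in this sketch)
        outputs.append(gate * local + (1 - gate) * mem_out)
        # 4) fold this chunk's keys/values into the fixed-size memory
        sk = elu_plus_one(k)
        memory += sk.T @ v
        norm += sk.sum(axis=0, keepdims=True).T
    return np.vstack(outputs)

# Tiny usage example with random weights: 64 tokens, model dim 16, 4 chunks.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 16))
Wq, Wk, Wv = (rng.normal(size=(16, 16)) for _ in range(3))
print(infini_attention_sketch(x, Wq, Wk, Wv, n_chunks=4).shape)  # (64, 16)
```

Note that memory use stays constant regardless of how many chunks are processed, which is the property the announcement highlights.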