ModernBERT Released: State-of-the-Art Encoder Model with 8,192 Token Support, 2 Trillion Tokens, and 2-3x Faster Processing
Dec 19, 2024, 04:51 PM
ModernBERT, a new family of state-of-the-art encoder-only models, has been released by AnswerAI and LightOn. This modernized successor to BERT supports a maximum sequence length of 8,192 tokens and is designed for improved performance on tasks such as classification and retrieval. Trained on 2 trillion tokens, ModernBERT is reported to be twice as fast as DeBERTaV3 on short contexts and three times faster than NomicBERT and GTE on long contexts. The model is available in two sizes: ModernBERT-base with 149 million parameters and ModernBERT-large with 395 million parameters. The release marks a significant upgrade over traditional BERT models, which had not seen a major overhaul in over six years.