Will FineWeb surpass RefinedWeb in usage by end of 2024?
Yes • 50%
No • 50%
Resolution basis: market analysis reports and usage statistics from HuggingFace and other relevant dataset platforms.
HuggingFace Launches FineWeb, a 15T Token Dataset Outperforming Others
Apr 21, 2024, 08:02 AM
HuggingFace has released a new dataset named FineWeb, comprising 15 trillion tokens of high-quality, deduplicated web data sourced from CommonCrawl crawls spanning 2013 to 2024. The dataset, available under an Open Data Commons license, has been shown to outperform existing datasets such as RefinedWeb, C4, DolmaV1.6, The Pile, and SlimPajama across various benchmarks, including 350B-token ablations. According to the announcement, FineWeb has been used to train models like LLaMA-3, with significant improvements attributed to its scale and quality. Given its size, the dataset is expected to support extensive training runs and has been made open-source for broad accessibility.
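FineWeb's quality claim rests partly on deduplication of CommonCrawl text. As a minimal illustration of the general idea only (not FineWeb's actual pipeline, which operates at web scale with fuzzy matching), exact-hash deduplication can be sketched as:

```python
import hashlib

def dedup_exact(docs):
    """Keep the first occurrence of each document, dropping exact
    duplicates by comparing SHA-256 digests of normalized text."""
    seen = set()
    unique = []
    for doc in docs:
        digest = hashlib.sha256(doc.strip().lower().encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = [
    "The quick brown fox.",
    "the quick brown fox.",      # duplicate after normalization
    "An entirely different page.",
]
print(dedup_exact(docs))  # keeps 2 of the 3 documents
```

Real pipelines typically replace the exact hash with near-duplicate detection (e.g. MinHash over shingles), since web copies rarely match byte-for-byte.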