Which region will have the highest number of contributors to INTELLECT-1 by December 31, 2025?
North America • 25%
Europe • 25%
Asia • 25%
Other • 25%
Resolution source: Official report or announcement from PrimeIntellect detailing contributor statistics
PrimeIntellect Releases INTELLECT-1, First Decentralized 10B AI Model Trained on 1T Tokens with 30 Contributors Across 3 Continents
Nov 29, 2024, 10:14 PM
PrimeIntellect has announced the release of INTELLECT-1, the world's first fully decentralized large language model (LLM) with 10 billion parameters. The model was trained across three continents on over 100 NVIDIA H100 GPUs, with 30 individual compute contributors. The release includes the base model, intermediate checkpoints, the pre-training dataset, and post-trained instruct models developed in collaboration with Arcee AI. The project is notable for its decentralized training framework, which challenges the traditional assumption that AI compute must be centralized. The model was trained on 1 trillion tokens and has shown promising results in evaluations. The technical details and framework are documented in an accompanying technical paper, and users can download the model for personal use.
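Since the summary notes that the released weights can be downloaded for personal use, here is a minimal sketch of loading them for local inference with the Hugging Face transformers library. The hosting location and the repo id "PrimeIntellect/INTELLECT-1-Instruct" are assumptions, not details confirmed by the announcement above.

```python
# Minimal sketch: load the released instruct model and run a short generation.
# The repo id below is an assumption; check PrimeIntellect's release page for the real one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PrimeIntellect/INTELLECT-1-Instruct"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 10B parameters; bf16 keeps memory use manageable
    device_map="auto",           # place layers on available GPU(s) or CPU
)

prompt = "Explain decentralized training in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```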