Will DeepSeek secure a major partnership with a Fortune 500 company by end of 2025?
Yes • 50%
No • 50%
Resolution source: Official press releases from DeepSeek or the partnering company
DeepSeek Releases MIT-Licensed 685B DeepSeek-R1 Model Rivaling OpenAI's o1 at 30x Lower Cost
Jan 20, 2025, 12:34 PM
DeepSeek has released its new open-source reasoning models, DeepSeek-R1 and DeepSeek-R1-Zero, under the MIT License. The 685-billion-parameter models perform on par with OpenAI's o1 across math, code, and reasoning tasks: DeepSeek-R1-Zero achieved 71.0% pass@1 on AIME 2024 and 86.7% with majority voting, while DeepSeek-R1 reached 79.8% pass@1, matching OpenAI's o1. The release includes a technical report describing a training pipeline that applies reinforcement learning without supervised fine-tuning for DeepSeek-R1-Zero and adds a 'language consistency reward' to improve the readability of DeepSeek-R1's reasoning outputs. The models are available on Hugging Face, through the DeepSeek website and API, and in DeepSeek's chat and Android/iOS apps. DeepSeek also released smaller distilled models, including a Qwen-based 1.5B model that outperforms GPT-4o and Claude 3.5 Sonnet on math benchmarks, scoring 28.9% on AIME 2024 and 83.9% on MATH-500. The models offer significant cost savings, with DeepSeek-R1 priced up to 30 times lower than OpenAI's o1.
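For readers who want to try the release described in the story, here is a minimal sketch of querying one of the distilled models locally with the Hugging Face transformers library. The repo id deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B, the prompt, and the generation settings are illustrative assumptions rather than details taken from the story.

```python
# Minimal sketch: load a distilled DeepSeek-R1 model and generate a response.
# Assumes the public Hugging Face repo id below and enough RAM/VRAM for a 1.5B model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Build a chat-formatted prompt; the distilled models emit their chain of
# thought before the final answer.
messages = [{"role": "user", "content": "What is 17 * 24? Think step by step."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same models are also served through DeepSeek's hosted API, which is what the pricing comparison in the story refers to; the local route above avoids any API key but runs at local hardware speed.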
Related question options:

Yes • 50%
No • 50%

Microsoft • 25%
Alibaba • 25%
Meta • 25%
Other • 25%

Yes • 50%
No • 50%

Amazon • 25%
Microsoft • 25%
Google • 25%
Other • 25%

0-10% • 25%
11-25% • 25%
26-50% • 25%
51% or more • 25%

Below 80% • 25%
80%-85% • 25%
85%-90% • 25%
Above 90% • 25%

Top 1 • 25%
Top 2-5 • 25%
Top 6-10 • 25%
Below 10 • 25%