Related Story

    China's DeepSeek Unveils 671B-Parameter AI Model Rivaling OpenAI, Trained in Two Months
    Chinese AI startup DeepSeek has caused a stir in Silicon Valley after releasing open-source AI models that rival those of leading U.S. companies like OpenAI and Google. DeepSeek's latest model, DeepSeek-R1, contains 671 billion parameters and matches or outperforms OpenAI's o1 model on certain benchmarks, despite being developed at a fraction of the cost and without access to cutting-edge chips restricted by U.S. export controls. The company used 2,048 Nvidia H800 GPUs to train DeepSeek-R1 in just two months, reportedly spending only $5.5 million to $6 million, far less than the hundreds of millions spent by U.S. firms. DeepSeek's success highlights the potential for innovation through efficient engineering, including the use of reinforcement learning, and challenges the notion that massive computing resources are essential for advanced AI development. Co-founded by Liang Wenfeng, the Hangzhou-based company's rapid progress has raised concerns in the U.S. tech industry, with reports of companies like Meta analyzing DeepSeek's model to glean insights. DeepSeek's achievements underscore China's growing competitiveness in artificial intelligence despite U.S. export controls on advanced chips.

    Proposed Market

    Which company will first acknowledge using DeepSeek insights by end of 2025?

    Description

    Company announcements or press releases

    Market Options