Microsoft Open-Sources bitnet.cpp, Achieves 6x Speed Improvements and 82% Energy Reduction
Oct 20, 2024, 10:00 AM
Microsoft has open-sourced bitnet.cpp, an efficient inference framework for 1-bit large language models (LLMs) that runs directly on CPUs. It enables deployment of models as large as 100 billion parameters on local devices with significant performance gains: up to 6x speedups on x86 CPUs and up to 5x on ARM CPUs, with energy reductions of 71.9% to 82.2%. The framework, which quantizes model weights to 1.58 bits, can run these models at 5-7 tokens per second. BitNet b1.58 was reported to be 4.1x faster while delivering 8.9x higher throughput. This development makes high-performance LLMs more accessible and energy-efficient.
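The "1.58 bits" figure comes from restricting each weight to one of three values, {-1, 0, +1} (log2(3) ≈ 1.585 bits of information per weight). As a rough illustration of the idea, and not bitnet.cpp's actual implementation, the sketch below applies absmean-style ternary quantization: scale the weight matrix by its mean absolute value, then round and clip every entry to the ternary set.

```python
import numpy as np

def ternary_quantize(W, eps=1e-8):
    """Illustrative ternary ("1.58-bit") quantization sketch.

    Scales weights by the mean absolute value of the matrix,
    then rounds and clips each entry to {-1, 0, +1}. The scale
    gamma is kept so activations can be rescaled at inference.
    """
    gamma = np.mean(np.abs(W)) + eps          # per-matrix scale
    Wq = np.clip(np.round(W / gamma), -1, 1)  # ternary weights
    return Wq.astype(np.int8), gamma

# Example: quantize a small random weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)).astype(np.float32)
Wq, gamma = ternary_quantize(W)
print(np.unique(Wq))  # only values from {-1, 0, 1}
```

Because a ternary matrix multiply reduces to additions, subtractions, and skips (no floating-point multiplies for the weights), CPUs can execute it far faster and with far less energy than full-precision inference, which is the effect the reported speed and energy numbers reflect.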