Microsoft Open-Sources bitnet.cpp, Achieves 6x Speed Improvements and 82% Energy Reduction
Oct 20, 2024, 10:00 AM
Microsoft has open-sourced bitnet.cpp, a highly efficient 1-bit large language model (LLM) inference framework that runs directly on CPUs. The framework makes it possible to run models as large as 100 billion parameters on local devices with significant performance gains. bitnet.cpp delivers speedups of up to 6x on x86 CPUs and up to 5x on ARM CPUs, along with energy reductions of 71.9% to 82.2%. By quantizing model weights to 1.58 bits, it can run a 100-billion-parameter model at 5-7 tokens per second on a single CPU. BitNet b1.58 has been reported to be 4.1 times faster and to deliver 8.9 times higher throughput than a comparable full-precision baseline. This development marks a significant step toward making high-performance LLMs more accessible and energy-efficient.
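The 1.58-bit figure reflects ternary weights: each weight takes one of three values {-1, 0, +1}, and log2(3) ≈ 1.58 bits per weight. Below is a minimal, hypothetical NumPy sketch of absmean ternary quantization in the style described for BitNet b1.58; the function and variable names are illustrative assumptions, and this is not the actual bitnet.cpp kernel code.

```python
import numpy as np

def quantize_ternary(weights: np.ndarray, eps: float = 1e-6):
    """Quantize a weight matrix to ternary values {-1, 0, +1} (~1.58 bits/weight).

    Illustrative absmean scheme: scale by the mean absolute value of the
    tensor, then round and clip to the ternary set. Not the bitnet.cpp kernel.
    """
    gamma = np.abs(weights).mean() + eps          # per-tensor absmean scale
    q = np.clip(np.rint(weights / gamma), -1, 1)  # ternary codes in {-1, 0, +1}
    return q.astype(np.int8), gamma               # dequantize as q * gamma

# Example: with ternary weights, the matmul can be reduced to additions and
# subtractions in an optimized kernel; here NumPy simply computes it directly.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8)).astype(np.float32)
x = rng.normal(size=8).astype(np.float32)

Wq, gamma = quantize_ternary(W)
y_approx = gamma * (Wq @ x)   # approximate output from ternary weights
y_exact = W @ x               # full-precision reference
print(np.round(y_approx, 3))
print(np.round(y_exact, 3))
```

This sketch only illustrates why 1-bit-style inference is CPU-friendly: once weights are restricted to {-1, 0, +1}, the dominant cost shifts from floating-point multiplies to additions, which is what specialized kernels such as those in bitnet.cpp exploit.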