Which major AI platform will first integrate BitsFusion by June 30, 2025?
OpenAI • 25%
Google • 25%
Meta • 25%
Other • 25%
Resolution source: Official announcements from AI platforms
BitsFusion Achieves 7.9X Compression of Stable Diffusion v1.5 to 1.99 Bits with Improved Performance
Jun 7, 2024, 07:20 AM
BitsFusion is a new weight-quantization method that compresses the UNet of Stable Diffusion v1.5 from 1.72 GB (FP16) to 219 MB (1.99 bits), a 7.9X compression ratio, while also improving generation performance. The paper introduces a mixed-precision strategy that assigns different bit widths to different layers, averaging 1.99 bits per weight, making diffusion-based image generation models far more efficient at synthesizing high-quality content.
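The headline numbers can be sanity-checked with simple arithmetic. Below is a minimal Python sketch assuming the widely cited ~860M-parameter figure for the SD v1.5 UNet (consistent with 1.72 GB at 16 bits per weight); the per-layer bit allocation at the end is purely hypothetical and only illustrates what a mixed-precision average of roughly 2 bits looks like, not the actual BitsFusion assignment.

```python
# Back-of-the-envelope check of the compression figures quoted above.
# Only the headline numbers (1.72 GB FP16, 219 MB at 1.99 bits, 7.9x)
# come from the announcement; everything else here is illustrative.

UNET_PARAMS = 860_000_000  # ~860M parameters in the SD v1.5 UNet (assumed)

def size_mb(num_params: float, bits_per_weight: float) -> float:
    """Storage needed for num_params weights at bits_per_weight bits each."""
    return num_params * bits_per_weight / 8 / 1e6

fp16_mb = size_mb(UNET_PARAMS, 16)     # ~1720 MB, matching "1.72 GB (FP16)"
quant_mb = size_mb(UNET_PARAMS, 1.99)  # ~214 MB, close to the quoted 219 MB
                                       # (the gap would be quantization metadata
                                       # such as per-layer scales and tables)

print(f"FP16 UNet:     {fp16_mb:8.1f} MB")
print(f"1.99-bit UNet: {quant_mb:8.1f} MB")
print(f"Compression:   {fp16_mb / 219:8.1f}x")  # ~7.9x, as stated

# Mixed precision means 1.99 bits is an *average*: sensitive layers keep more
# bits, robust layers get fewer. A hypothetical allocation for illustration:
layers = [
    ("time_embedding", 10_000_000, 4.0),
    ("down_blocks",   300_000_000, 2.0),
    ("mid_block",     100_000_000, 3.0),
    ("up_blocks",     450_000_000, 1.7),
]
avg_bits = sum(n * b for _, n, b in layers) / sum(n for _, n, _ in layers)
print(f"Average bits in the toy allocation: {avg_bits:.2f}")
```

Running the sketch reproduces the quoted sizes to within a few megabytes and shows why a ~2-bit average is compatible with keeping a handful of layers at higher precision.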
Related markets (question text not captured):

Google • 20%
Microsoft • 20%
Amazon • 20%
Samsung • 20%
None • 20%

Google • 25%
Meta • 25%
Microsoft • 25%
OpenAI • 25%

Yes, further AI integrations announced • 50%
No further AI integrations announced • 50%

Yes • 50%
No • 50%

Phi-3 series • 25%
Mixtral series • 25%
GPT series • 25%
Llama series • 25%

OpenAI • 25%
Google • 25%
Meta • 25%
None • 25%

Image Generation • 25%
Other • 25%
Natural Language Processing • 25%
Video Compression • 25%