Will Google DeepMind's JEST AI training technique be integrated into a major open-source AI framework by the end of 2024?
Yes • 50%
No • 50%
Resolution source: official repositories on GitHub or announcements from the maintainers of major open-source AI frameworks
Google DeepMind's JEST AI Training Technique Cuts Training Time by 13x, Reduces Energy Use by 90%
Jul 6, 2024, 02:13 AM
Google DeepMind has introduced a new AI training technique called JEST that significantly reduces the time and computational power required to train AI models. JEST is reported to be up to 13 times faster and 10 times more energy-efficient than previous techniques, corresponding to a roughly 90% reduction in computing demand. First described in a research paper in April, the method is said to be particularly effective for code generation and to deliver its largest gains at bigger model sizes. JEST uses a pretrained reference model to select data subsets for training based on their 'collective learnability,' drastically cutting the resources needed for AI training. This development has the potential to make the training process markedly more efficient and environmentally friendly.
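The core idea behind the selection step can be sketched as follows. This is a minimal illustration, not the paper's implementation: it scores each candidate example by the gap between the current learner's loss and a pretrained reference model's loss (examples the learner still finds hard but the reference finds easy are the most "learnable"), then takes an independent top-k, whereas the actual JEST method selects sub-batches jointly. All names here are illustrative.

```python
import numpy as np

def select_learnable(learner_losses, reference_losses, k):
    """Pick the k candidate examples with the highest learnability score.

    Learnability is approximated as (learner loss - reference loss):
    a large gap means the learner has not yet mastered an example that
    a well-trained reference model handles easily.
    This is a simplified stand-in for JEST's joint batch selection.
    """
    scores = np.asarray(learner_losses) - np.asarray(reference_losses)
    # Sort descending and keep the k highest-scoring indices.
    return np.argsort(scores)[::-1][:k]

# Toy usage: 6 candidate examples, keep the 3 most learnable.
learner = [2.0, 0.5, 3.0, 1.0, 2.5, 0.2]
reference = [0.5, 0.4, 2.9, 0.2, 0.5, 0.1]
idx = select_learnable(learner, reference, k=3)  # → indices [4, 0, 3]
```

In this toy run, example 2 has a high absolute loss but is excluded because the reference model also finds it hard (likely noisy or mislabeled data), which is the filtering behavior that lets the technique skip low-value training examples.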