Google DeepMind's JEST AI Training Technique Cuts Training Time by 13x, Reduces Energy Use by 90%
Jul 6, 2024, 02:13 AM
Google DeepMind has introduced a new AI training technique called JEST that significantly reduces the time and computing power required to train AI models. The method is reported to be up to 13 times faster and 10 times more energy-efficient than previous techniques, cutting computing power demand by roughly 90%. The technique, first revealed in a research paper in April, is described as particularly efficient at generating code and as especially beneficial for larger model sizes. JEST uses a pretrained reference model to select data subsets for training based on their 'collective learnability,' which drastically cuts the resources needed for AI training. This development has the potential to reshape the AI industry by making the training process more efficient and environmentally friendly.
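The selection idea can be pictured with a small sketch. The snippet below is a minimal, hypothetical illustration and not Google DeepMind's code: it scores each example by how much harder it is for the current learner than for a pretrained reference model, and keeps only the top-scoring fraction of a large candidate batch. The actual JEST method scores sub-batches jointly ('collective learnability') rather than example by example; the function names and numbers here are made up for illustration.

import numpy as np

def learnability_scores(learner_losses, reference_losses):
    # Per-example "learnability": examples the current learner still finds hard
    # (high learner loss) but that a pretrained reference model finds easy
    # (low reference loss) score highest.
    return np.asarray(learner_losses) - np.asarray(reference_losses)

def select_sub_batch(learner_losses, reference_losses, keep_fraction=0.1):
    # Keep only the most learnable fraction of a large candidate batch.
    scores = learnability_scores(learner_losses, reference_losses)
    k = max(1, int(len(scores) * keep_fraction))
    return np.argsort(scores)[-k:]  # indices of the k highest-scoring examples

# Toy usage: score a candidate batch of 8 examples and keep the top 25%.
learner = [2.3, 0.9, 1.7, 3.1, 0.4, 2.8, 1.1, 2.0]
reference = [1.0, 0.8, 0.5, 2.9, 0.3, 1.2, 1.0, 0.6]
print(select_sub_batch(learner, reference, keep_fraction=0.25))

Training only on the selected indices is what drives the reported savings: most of the candidate batch is never used for a gradient step.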
Markets
Market 1: No • 50% | Yes • 50%
Resolution source: Official announcements from major tech companies or credible news sources

Market 2: No • 50% | Yes • 50%
Resolution source: Academic databases like Google Scholar or PubMed

Market 3: Yes • 50% | No • 50%
Resolution source: Official repositories on GitHub or announcements from the maintainers of major open-source AI frameworks

Market 4: TensorFlow • 25% | Other • 25% | JAX • 25% | PyTorch • 25%
Resolution source: Official repositories on GitHub or announcements from the maintainers of major open-source AI frameworks

Market 5: Other • 25% | Microsoft • 25% | Amazon • 25% | Meta • 25%
Resolution source: Official announcements from the companies or credible news sources

Market 6: Technology • 25% | Other • 25% | Healthcare • 25% | Finance • 25%
Resolution source: Industry reports and performance metrics from credible sources