Google claims new AI training tech is 13 times faster and 10 times more power efficient — DeepMind’s new JEST optimizes training data for impressive gains | Tom’s Hardware


Google’s DeepMind has introduced JEST (joint example selection training), a new training method for AI models that it claims significantly improves training speed and energy efficiency. Rather than scoring individual data points, JEST operates on whole batches: a smaller reference model assesses data quality and selects the most learnable batches, which are then used to train a larger model. The method's success hinges on having a high-quality, well-curated reference dataset, which makes it difficult for amateur AI developers to implement. The timing of this research is notable given growing concern over the environmental impact of AI data centers, as AI workloads already consume a substantial amount of power. Whether major players in the AI space will adopt JEST-style methods remains uncertain, but the hope is that it could lower both power consumption and training costs. That said, companies may spend the same budget on faster training rather than cheaper training, and that trade-off between cost savings and hyper-fast output may ultimately determine its impact on the industry.
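The core selection step described above can be sketched in a few lines. This is a simplified, hypothetical illustration, not DeepMind's implementation: the per-example losses are randomly generated stand-ins for real model evaluations, and the "learnability" score (learner loss minus reference-model loss) is one way such a batch-scoring scheme can be framed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-example losses (stand-ins for real model evaluations):
# learner_loss under the large model being trained, ref_loss under the
# small pretrained reference model.
n_candidates, batch_size = 1024, 32
learner_loss = rng.uniform(0.5, 3.0, n_candidates)
ref_loss = rng.uniform(0.2, 2.5, n_candidates)

# "Learnability": high where the learner still struggles (high learner
# loss) but the reference model found the example easy (low ref loss).
learnability = learner_loss - ref_loss

# Score candidate batches jointly by summed learnability, keep the best.
candidate_batches = rng.permutation(n_candidates).reshape(-1, batch_size)
batch_scores = learnability[candidate_batches].sum(axis=1)
top = np.argsort(batch_scores)[::-1][:4]
selected = candidate_batches[top]  # the 4 batches fed to the large model
print(selected.shape)  # → (4, 32)
```

The key idea is that ranking happens at the batch level, so the large model only ever sees data the small model has already judged worth training on.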

