Up to 30% of the power used to train AI is wasted: A software tool could help fix that

A less wasteful way to train large language models, such as the GPT series, completes training in the same amount of time while using up to 30% less energy, according to a new study from the University of Michigan.