Reducing emissions has become a widespread goal. People save energy on heating, recycle glass and plastic, and choose public transport or a bicycle instead of a car. Meanwhile, new sources of carbon dioxide emissions are emerging that are underestimated or not even recognized at all. Artificial intelligence is one of them.
In a new study, researchers from Carnegie Mellon University described how much current artificial intelligence models contribute to climate change. These models are now used by tens, perhaps even hundreds, of millions of people around the world every day.
This research is the first systematic comparison of the energy costs associated with machine learning models. In the study, which has yet to go through peer review, the scientists found that using an artificial intelligence model to generate a single image requires about as much energy as charging a smartphone.
“People think that AI has no environmental impact, that it’s an abstract technological entity that lives somewhere in the ‘cloud’,” said team leader Alexandra Luccioni. “But every query to an artificial intelligence model has a cost to the planet that is important to calculate.”
Her team tested 88 models on 30 data sets and found significant differences in electricity consumption, and therefore in greenhouse gas emissions, between models. In this way they measured the amount of carbon dioxide emitted per task.
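As an illustration of how such per-task emissions can be estimated, here is a minimal sketch using the open-source codecarbon package together with a small Hugging Face text-generation model. The article does not name the exact tooling the researchers used, so this is only an assumed, representative setup rather than their actual measurement pipeline.

```python
# Minimal sketch: estimate the CO2 emissions of a single text-generation task.
# Assumptions: the codecarbon and transformers packages; "gpt2" is only an
# illustrative small model, not one named in the study.
from codecarbon import EmissionsTracker
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

tracker = EmissionsTracker()   # estimates energy drawn by CPU/GPU/RAM
tracker.start()
generator("The climate impact of AI is", max_new_tokens=50)
emissions_kg = tracker.stop()  # returns estimated kilograms of CO2-equivalent

print(f"Estimated emissions for one generation: {emissions_kg:.6f} kg CO2eq")
```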
The Stable Diffusion XL image generator from Stability AI turned out to consume the most power: during a single task it produced almost 1,600 grams of carbon dioxide. According to Luccioni, that roughly corresponds to driving four kilometers in a gasoline-powered car. The smallest emissions, by contrast, were associated with models that handle text tasks.
Generative tasks that create new content, such as producing images or summarizing text, are generally more energy-intensive, and therefore more carbon-intensive, than tasks that merely sort data, such as classifying films, the researchers said.
The authors also observed that using multi-purpose models for discriminative tasks is more energy-intensive than using task-specific models. This matters, according to the researchers, given current trends in how models are used.
“We consider this last point to be the most compelling conclusion of our study, given the current paradigm shift from smaller models tuned for a specific task to multitasking models deployed to respond to a barrage of user queries in real time,” they said in the report.
People use AI senselessly
According to Luccioni, this use of artificial intelligence is pointless in terms of energy and emissions: “If you’re building a specific application, like searching e-mail, do you really need these big models that are capable of everything? I would say not.”
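To make the contrast concrete, here is a minimal sketch, assuming the Hugging Face transformers library, of the two routes for a simple discriminative task such as sentiment classification. The model names are illustrative choices, not ones benchmarked in the study.

```python
# Minimal sketch: a multi-purpose model vs. a task-specific model for the
# same discriminative task. Model names are illustrative assumptions.
from transformers import pipeline

review = "I expected very little from this film and it still disappointed me."

# Multi-purpose route: a large zero-shot model pressed into service as a
# sentiment classifier by phrasing the task as candidate labels.
zero_shot = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
print(zero_shot(review, candidate_labels=["positive", "negative"]))

# Task-specific route: a small model fine-tuned for exactly this job, which
# typically needs far less computation (and energy) per query.
sentiment = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(sentiment(review))
```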
While the carbon dioxide emissions of such individual tasks may seem small, when multiplied by the millions of users who rely on AI-powered programs every day, often with multiple queries each, they add up to totals that could significantly affect efforts to reduce environmental damage.
“I think with generative artificial intelligence in general, we should be aware of where and how we use it and compare its costs and benefits,” added Luccioni.