September 30, 2024 by Emily MacKinnon, Dalhousie University

Collected at: https://techxplore.com/news/2024-09-ai-doesnt-environment.html

Artificial intelligence (AI) has changed the world as we know it. It’s been used for everything from health-care monitoring to writing speeches. But the technology’s impact on the environment is becoming a serious concern.

ChatGPT, one of the most familiar AI models, is a form of generative AI that uses natural language processing to respond to user queries in a chatbot-style web interface.

When OpenAI, the company that created ChatGPT, was training the third generation of its model (that is, teaching it what content to generate in response to users' questions), it used enough electricity to power 120 Canadian homes for an entire year.

And training is just one aspect of an AI model’s emissions. The largest contributor over time is model inference, or the process of running the model live. Large language models like ChatGPT run constantly, waiting for a user to ask a question.

The data centers required to power these models currently account for three percent of global energy consumption. They rarely run on renewable energy sources and, according to Forbes, emit as much CO2 as the entire country of Brazil.

Enter Dr. Tushar Sharma, an assistant professor in Dalhousie’s Faculty of Computer Science.

Dr. Sharma's research focuses on sustainable AI and software engineering. In other words: he works to ensure the source code that builds and runs these models is as clean and efficient as possible. When it is not, he identifies and fixes the inefficiencies.

Dr. Sharma’s SMART Lab recently published a study in ACM Transactions on Software Engineering and Methodology detailing how to measure an AI model’s energy consumption on a granular level by identifying which parts of the code are the most power hungry. (Think of your home’s power bill: it shows the home’s energy consumption writ large, but it doesn’t typically break down which appliances are drawing the most electricity.)
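The study's actual tooling is not reproduced here, but the idea can be sketched in a few lines of Python. This is a minimal illustration only: it attributes an energy estimate to each code region by multiplying measured runtime by an assumed fixed power draw, standing in for the real hardware energy counters (such as Intel RAPL or NVIDIA NVML) that fine-grained measurement tools typically read. The function names and the 250 W figure are hypothetical.

```python
import time
from collections import defaultdict
from functools import wraps

# Assumed average power draw in watts -- a hypothetical stand-in for
# real hardware energy counters, which actual tooling would query.
ASSUMED_POWER_WATTS = 250.0

energy_log = defaultdict(float)  # estimated joules per code region


def track_energy(func):
    """Attribute estimated energy (power x runtime) to one code region."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        energy_log[func.__name__] += elapsed * ASSUMED_POWER_WATTS
        return result
    return wrapper


@track_energy
def attention_layer(n):
    return sum(i * i for i in range(n))  # stand-in for real model code


@track_energy
def feed_forward_layer(n):
    return sum(i for i in range(n))  # stand-in for real model code


attention_layer(200_000)
feed_forward_layer(50_000)

# Rank regions by estimated energy -- the "itemized power bill"
for name, joules in sorted(energy_log.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {joules:.4f} J")
```

The output is exactly the breakdown the power-bill analogy describes: instead of one total, each "appliance" (code region) gets its own line.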

In another study, his lab sifted through dozens of layers of code within AI models to “prune” tokens that were no longer relevant, useful, or effective.

“We move strategically through each layer of these big models and reduce the required computation inside,” he explains.

The idea is to train the models more efficiently, so the electrical draw and subsequent emissions are reduced. “We are trying to not have to use as much power or time, which leads to an energy reduction or a reduction in carbon emissions,” he says. “The ideal scenario is that we are reducing the energy required to train or operate these systems without sacrificing the benefits.”
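The pruning strategy described above can be sketched in miniature. The following is an illustrative toy, not the lab's method: it assumes each token carries an importance score (as a model's attention weights might supply) and drops the lowest-scoring tokens at each layer, so every subsequent layer has less to compute. The tokens, scores, and keep ratio are all invented for the example.

```python
# Toy sketch of layer-wise token pruning: at each "layer", tokens whose
# importance score ranks below the cutoff are dropped, shrinking the
# computation required by all later layers.

def prune_tokens(tokens, scores, keep_ratio):
    """Keep only the highest-scoring fraction of tokens, in original order."""
    k = max(1, int(len(tokens) * keep_ratio))
    ranked = sorted(range(len(tokens)), key=lambda i: scores[i], reverse=True)
    kept = sorted(ranked[:k])  # preserve original token order
    return [tokens[i] for i in kept], [scores[i] for i in kept]


tokens = ["The", "quick", "brown", "fox", "jumps", "over", "the", "dog"]
# Hypothetical importance scores a model's attention might assign
scores = [0.1, 0.9, 0.4, 0.95, 0.8, 0.2, 0.05, 0.7]

# Simulate three layers, each keeping 75% of the surviving tokens
for layer in range(3):
    tokens, scores = prune_tokens(tokens, scores, keep_ratio=0.75)
    print(f"after layer {layer}: {tokens}")
```

After three layers the eight tokens have been cut to three, which is the point: fewer tokens in flight means less computation, and therefore less energy, at every remaining layer.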

So is AI worth it?

Dr. Christian Blouin, acting dean of Dal’s Faculty of Computer Science, says AI has the potential to transform the world as we know it, and it’s going to happen whether we make the technology greener or not.

“We have a responsibility to find a better way to tackle important problems that require less resources,” he says. “As people discover new ways to leverage AI, it is critical that we develop the computer science to make it more sustainable.”

This balance is especially important for people who work within the climate sector. Dr. Anya Waite is the CEO and scientific director of the Ocean Frontier Institute (OFI), a research institute at Dal. OFI researches the ocean’s changing role in our climate system and delivers solutions for climate change mitigation.

Dr. Waite says that while AI is a critical tool to manage data and improve efficiency and accuracy, it becomes unsustainable if we end up spending more energy than we save from its use.

“Dr. Sharma’s work is critical because it supports AI efficiency and lowers its cost and carbon footprint,” she says. “Ultimately, without work such as Dr. Sharma’s, we risk losing the ability to launch new innovations, and we could miss out on the major benefits they provide.”

A tricky balance

Dr. Michael Freund is the director of Dal’s Clean Technologies Research Institute (CTRI), and he says users are not always aware of the infrastructure and operations required to support the technology they use.

“Responsible growth of AI must consider environmental factors,” Dr. Freund says. “It must require efficient operation, including more efficient code, responsible use and coupling data centers to green energy sources.”

It's a tricky balance, he acknowledges, because, like OFI, CTRI often uses AI to increase the efficiency of its own operations.

“Work by researchers such as Dr. Sharma will shed light on the true value of AI and inform decisions on how it is developed and used,” he says.

A green AI future

Converting data centers to renewable energy sources is another big hurdle, and Dr. Sharma says research like his, coupled with solar, wind and hydro power, will make AI greener.

“All of these techniques are ultimately helping to achieve this goal of green AI and figuring out how we can keep using these machine learning models, but at a lower energy cost.”

More information: Saurabhsingh Rajput et al, Enhancing Energy-Awareness in Deep Learning through Fine-Grained Energy Measurement, ACM Transactions on Software Engineering and Methodology (2024). DOI: 10.1145/3680470
