
Experts Grapple with AI’s Environmental Costs 

AI algorithms have made tremendous progress in making many computational processes more efficient, but “efficiency” isn’t always the name of the game for the AI industry itself. Famously, a team of researchers at the University of Massachusetts Amherst concluded that – at least as of a few years ago – a single training run for an AI model could have a carbon footprint on par with the lifetime emissions of several cars. As use of AI has snowballed – with bigger and bigger models in play – that environmental cost is coming into sharper focus.

At the 2021 AI Hardware Summit, a panel titled “Ensuring Sustainability in AI Systems” faced this question head-on. Sitting on the panel: Carole-Jean Wu, a research scientist at Facebook; David Patterson, a distinguished engineer for Google Brain and co-winner of the 2017 Turing Award; and David Kanter, executive director of MLPerf.

“The resource demand of AI has been growing exponentially in the past decade,” Wu said. “It is quite obvious that the growth trend of AI is currently at an unsustainable pace – so while the demand stimulates significant market growth, it can really impact our environment in many different ways.”

Is AI training really the problem?

Patterson’s opening salvo challenged the conventional wisdom on AI’s carbon footprint, calling into question conclusions like those in the Amherst paper as he recalled an investigation he had participated in on the carbon footprint of training large neural networks. “There were papers that we cited … that were off by, kind of remarkably, by almost a factor of 100 too high,” he said. “I think if we’ve learned anything about our modern society, it’s really important that we agree on the facts.”

Citing a paper published in Science, Patterson recalled that it concluded “‘there is energy going into the cloud, but that’s taking energy consumption away from data centers, which are not as efficient as the cloud.’ So when they factored that in and they looked over the past decade, the energy consumption for data centers overall was only [up] six percent despite having more than five times as much compute power, and the reason was [that] inefficient, underutilized data centers and buildings are getting replaced by highly efficient cloud.”

Further, Patterson said, while AI training does use energy, the nature and scale of that energy matters. “Several of these companies are being very aggressive about carbon footprint and trying to use carbon-free energy sources,” he said, citing strong sustainability efforts from Google, Microsoft and others. And at Google, he said, large-model training accounted for less than 0.005 percent of the company’s energy use. “It’s round-off error,” he said. “It’s a very tiny piece of what Google does.”

The panelists also questioned whether training was the most important problem with AI’s carbon footprint. “My intuition is that inference is the biggest problem,” Kanter said, since it scaled with the number of customers. As a result, he said, an upfront carbon investment in training might be worthwhile if it yielded meaningful reductions on the inference side.
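
Kanter’s trade-off is easiest to see as back-of-the-envelope arithmetic. The sketch below uses entirely hypothetical numbers – the training footprint, per-query footprint and query volume are assumptions for illustration, not figures from the panel: the total footprint is a one-time training cost plus a per-query inference cost multiplied by the queries served, so extra carbon spent on training can pay for itself once query volume is large enough.

```python
# Back-of-the-envelope sketch of the training-vs-inference trade-off.
# All numbers are hypothetical assumptions for illustration only.

def total_emissions_kg(training_kg, per_query_g, queries):
    """One-time training footprint plus per-query inference footprint, in kg CO2e."""
    return training_kg + (per_query_g / 1000.0) * queries

QUERIES = 1_000_000_000  # assumed lifetime query volume

# Baseline model: cheaper to train, but costlier to serve per query.
baseline = total_emissions_kg(training_kg=50_000, per_query_g=2.0, queries=QUERIES)

# Heavier training (say, extra optimization work) that halves the per-query cost.
optimized = total_emissions_kg(training_kg=150_000, per_query_g=1.0, queries=QUERIES)

print(f"baseline:  {baseline:,.0f} kg CO2e")   # 2,050,000 kg
print(f"optimized: {optimized:,.0f} kg CO2e")  # 1,150,000 kg
```

Under these assumptions, the extra 100,000 kg spent up front is repaid roughly tenfold on the inference side; at a much smaller query volume the ordering flips, which is why the panelists framed the extra training as an investment rather than a guaranteed win.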

Zeroing in on manufacturing

Further, Wu said, in many cases, “the dominating source of computing’s environmental cost has shifted quite a bit … to hardware manufacturing” rather than product use – the result of many years of efficiency work on the use side. Apple, for instance, found that 74 percent of its end-to-end carbon footprint stemmed from manufacturing rather than client use, with 33 percent coming from integrated circuit manufacturing.

To this point, Kanter noted that “client applications and business applications are very different,” arguing that the utilization factor of Apple’s client devices was likely much lower than the utilization factor of, say, a cloud data center. Patterson, meanwhile, defended the relative sustainability of data center hardware by contrasting its comparatively long lifetime and high utilization with the shorter lifetimes of consumer electronics like smartphones.
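
To make the utilization point concrete, here is a minimal sketch with made-up numbers – the embodied-carbon figures, lifetimes and utilization rates below are illustrative assumptions, not data from Apple, the panel or any vendor. Manufacturing carbon is a fixed cost, so the fewer hours of useful work a device delivers over its life, the more of that fixed cost each hour of work carries.

```python
# Minimal sketch of amortizing embodied (manufacturing) carbon over useful work.
# Embodied-carbon figures, lifetimes and utilization rates are made-up assumptions.

def embodied_g_per_useful_hour(embodied_kg, lifetime_years, utilization):
    """Manufacturing carbon spread over the hours a device spends doing useful work."""
    useful_hours = lifetime_years * 365 * 24 * utilization
    return embodied_kg * 1000.0 / useful_hours  # grams CO2e per useful hour

phone = embodied_g_per_useful_hour(embodied_kg=70, lifetime_years=3, utilization=0.05)
server = embodied_g_per_useful_hour(embodied_kg=1_500, lifetime_years=6, utilization=0.60)

print(f"phone:  {phone:.0f} g CO2e per useful hour")   # ~53 g
print(f"server: {server:.0f} g CO2e per useful hour")  # ~48 g
```

Despite the server’s much larger embodied footprint, a longer service life and higher utilization spread it across far more hours of useful work – which is the contrast Patterson and Kanter were drawing between client devices and data center hardware.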

Manufacturing sustainability for AI hardware, though, opens up its own can of worms – because, as Wu said, “[carbon] emission is a piece of the puzzle.” The rest of the puzzle, she said, includes massive environmental issues like e-waste (50 million tons produced in 2019, according to Wu) and wastewater treatment. Kanter agreed: “When it comes to semiconductors, CO2 is an issue,” he said, “but there’s a lot of other things that are issues, like gallium arsenide. … There’s a lot of ways these things can be hazardous outside of CO2.”

Developing solutions

To resolve these issues, the panelists offered many ideas for improving the sustainability of AI systems. Kanter advocated for measurement and benchmarking. On the surface, he said, one might expect the efficiency of algorithms to progress at the rate of Moore’s law – but instead, gains on MLPerf benchmarks had substantially outstripped it. “My point is that benchmarks help to drive things,” he said. “The point of a benchmark is that it aligns the whole industry on what ‘better’ means.”

Patterson agreed, citing his early interest in including power and training measurements in MLPerf. “Sadly, I didn’t win that argument,” he noted. “Engineers compete. If you add a metric that you can compete on, it will affect outcomes,” he said. “Performance-per-watt, rather than straight performance, would be a big help.” At Google, he had learned that Google Cloud would cite its sustainability when competing against Amazon and Microsoft for bids – so the marketing potential, he explained, is there.
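
A minimal sketch of why the metric matters, using two hypothetical benchmark submissions (these are not real MLPerf results, and MLPerf’s actual power-measurement rules are more involved): ranking by raw throughput and ranking by performance-per-watt can pick different winners, which is exactly the competitive pressure Patterson wants the benchmark to create.

```python
# Minimal sketch: the same two hypothetical submissions ranked two ways.
# Neither system nor its numbers are real MLPerf results.

submissions = [
    # (name, throughput in samples/s, average power draw in watts)
    ("system_a", 12_000, 6_000),
    ("system_b", 10_000, 3_500),
]

fastest = max(submissions, key=lambda s: s[1])                # raw performance
most_efficient = max(submissions, key=lambda s: s[1] / s[2])  # performance per watt

print("fastest:        ", fastest[0])         # system_a (12,000 samples/s)
print("most efficient: ", most_efficient[0])  # system_b (~2.9 samples/s per watt)
```

A leaderboard that rewards only the first ranking gives vendors little reason to compete on the second; adding performance-per-watt would put the engineering competition Patterson describes to work on energy efficiency.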

The panelists also highlighted the importance of choosing sustainable production and operation facilities. “The biggest surprise in that paper I talked about was how important location was,” Patterson said. Wu agreed, stressing the difference that could be made by selecting an environmentally friendly fab or looking, generally, for facilities supplied by carbon-free energy.

Patterson closed on a note of personal responsibility, urging AI researchers to push for these measures within their companies. “If you really believe that global climate change is real, I think you should be asking what you can do individually as part of your job and as part of your personal behavior.”

EnterpriseAI