Wow, that's *bad.*

Up in the Air

By now, you're probably well aware of the staggering energy and resource costs of generative AI. But even if the whole industry is a bubble ready to burst, chances are that the environmental toll we're hearing about now is only going to get worse — because AI's appetite is absolutely insatiable.

Consider the obscene amount of water needed just to cool the data centers that train and host generative AI models, which runs into the millions of gallons per year. Internal estimates from Microsoft about its data facility in Goodyear, Arizona, for example, show that it's set to consume 56 million gallons of drinking water annually — which is more than a drop in the ocean for such a water-scarce region.

But as Wired reports, the way data centers use up water is even more wasteful than a household leaving the tap running.

"The water that is available for people to use is very limited," Shaolei Ren, a responsible AI researcher at UC Riverside, told the magazine. "It's just the fresh surface water and groundwater. Those data centers, they're just evaporating water into the air."

"When we get the water from the utility, and then we discharge the water back to the sewage immediately, we are just withdrawing water — we're not consuming water," Ren continued. "A data center takes the water from this utility, and they evaporate the water into the sky, into the atmosphere."

And once evaporated, that water may not come back down to Earth for as long as a year.

Down the Drain

What do we get for all this consumption? The technology has many applications, ranging from chatbots to image generators. But for the average internet user, the most visible is probably Google's dubious AI search summaries, which are so advanced that they'll recommend adding glue to pizza.

Call wacky hallucinations like that outliers, but it's not unreasonable to say that AI summaries offer no significant benefit over conventional search results beyond concision at the cost of accuracy, while carrying clear and enormous drawbacks in efficiency.

"In the back end, these algorithms that need to be running for any generative AI model are fundamentally very, very different from the traditional kind of Google Search or email," Sajjad Moazeni, an AI researcher at the University of Washington, told Wired.

"For basic services, those were very light in terms of the amount of data that needed to go back and forth between the processors," he said, estimating that these generative AI applications are 100 to 1,000 more times intensive. Other estimates put Google's AI search in the ballpark of consuming ten times as much energy than a regular one.

It's more important than ever, then, to scrutinize these costs. Google, as star-struck by AI as every other tech leader, has cynically stopped pretending to be a carbon-neutral company so it can keep shoving chatbots in everyone's faces. If that's any indication of the rest of the industry's attitude toward the problem, it augurs ill.

More on AI: Washington Post Launches AI to Answer Climate Questions, But It Won't Say Whether AI Is Bad for the Climate

