By now, most of us should be vaguely aware that artificial intelligence is hungry for power.

Even if you don't know the exact numbers, the charge that "AI is bad for the environment" is well-documented, bubbling up from sources ranging from the mainstream press to pop-science YouTube channels to tech trade media.

Still, the AI industry as we know it today is young. Though startups and big tech firms have been plugging away on large language models (LLMs) since the 2010s, the release of consumer generative AI in late 2022 brought about a huge increase in AI adoption, leading to an unprecedented "AI boom."

In under three years, AI has come to dominate global tech spending in ways researchers are just starting to quantify. In 2024, for example, AI companies nabbed 45 percent of all US venture capital tech investments, up from only nine percent in 2022. In the medium term, big-name consulting firms like McKinsey expect AI infrastructure spending to grow to $6.7 trillion by 2030, up from just $450 billion in 2022.

Given that breakneck pace, research on AI's climate and environmental impacts can seem vague and scattered, as analysts race to establish concrete environmental trends amid the extraordinary explosion of the AI industry.

A new survey by MIT Technology Review is trying to change that. The authors spoke to two dozen AI experts working to uncover the tech's climate impact, combed "hundreds of pages" of data and reports, and probed the top developers of LLM tools in order to provide a "comprehensive look" at the industry's impact.

"Ultimately, we found that the common understanding of AI’s energy consumption is full of holes," the authors wrote. That led them to start small, looking at the energy use of a single LLM query.

Beginning with text-based LLMs, they found that model size directly predicted energy demand: bigger LLMs use more chips, and therefore more energy, to process questions. While smaller models like Meta's Llama 3.1 8B used roughly 57 joules per response (or 114 joules once the authors factored in cooling and other overhead), larger models needed 3,353 joules (or 6,706), which, in MIT Tech Review's point of reference, is enough to run a microwave for about eight seconds.

Image-generating AI models, like Stable Diffusion 3 Medium, needed 1,141 joules (or 2,282) on average to spit out a standard 1024 x 1024 pixel image, the kind that's rapidly strangling the internet. Doubling the image quality roughly doubles the energy use, to 4,402 joules, worth over five seconds of microwave time and still less than the largest text models.

Video generation is where the sparks really start flying. The lowest-quality AI video model the authors tested, a nine-month-old version of CogVideoX, took an eye-watering 109,000 joules to spew out a low-quality, 8fps clip that they described as "more like a GIF than a video."

Better models use a lot more. With a recent update, that same model takes 3.4 million joules to spit out a five-second, 16fps video, equivalent to running a microwave for over an hour.
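
To put those figures on one scale, here's a quick back-of-the-envelope conversion of the quoted joule counts into microwave runtime. The roughly 800-watt microwave is our own assumed reference appliance (it lines up with the comparisons above); the per-query joule figures are the ones from the survey.

```python
# Convert the per-query energy figures quoted above into seconds of
# microwave runtime. The ~800-watt microwave is an assumed reference
# appliance, not a figure from the survey.

MICROWAVE_WATTS = 800  # 1 watt = 1 joule per second

figures_joules = {
    "small text model (Llama 3.1 8B, with cooling)": 114,
    "large text model (with cooling)": 6_706,
    "1024 x 1024 image (Stable Diffusion 3 Medium, with cooling)": 2_282,
    "higher-quality image": 4_402,
    "five-second, 16fps video": 3_400_000,
}

for task, joules in figures_joules.items():
    seconds = joules / MICROWAVE_WATTS
    print(f"{task}: {joules:,} J ≈ {seconds:,.1f} s of microwave time")
```

At that wattage, the video figure works out to roughly 71 minutes, consistent with the "over an hour" comparison above.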

Whether any of those numbers amount to a lot or a little is open to debate. Running the microwave for a few seconds isn't much, but if everybody starts doing so hundreds of times a day (or, in the case of video, for hours at a time), it'll make a huge impact on the world's power consumption. And of course, the AI industry is currently trending toward models that use more power, not less.
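
For a sense of how the per-query numbers compound, here's a purely illustrative scaling sketch. The usage assumptions (a billion users running 100 large-model text queries a day) are hypothetical, not figures from the survey.

```python
# Illustrative scaling only: the usage numbers below are hypothetical.
USERS = 1_000_000_000            # assumed daily users
QUERIES_PER_USER_PER_DAY = 100   # assumed queries per user per day
JOULES_PER_QUERY = 6_706         # large text model figure quoted above

daily_joules = USERS * QUERIES_PER_USER_PER_DAY * JOULES_PER_QUERY
daily_gwh = daily_joules / 3.6e12  # 1 GWh = 3.6e12 joules
print(f"~{daily_gwh:,.0f} GWh per day")  # roughly 186 GWh/day at these assumptions
```

Under those assumptions, that's on the order of tens of terawatt-hours a year, which is why the per-query framing only tells part of the story.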

Zooming out, the MIT Tech survey also highlights some concerning trends.

One is the overall rise in power use that correlates with the rise of AI. While US data centers' power consumption held mostly steady between 2005 and 2017, it had doubled by 2023, our first full year with mass-market AI.

As of 2024, 4.4 percent of all electricity consumed in the US went toward data centers. Meanwhile, the carbon intensity of data centers' electricity (the amount of iceberg-melting exhaust spewed per unit of energy used) was 48 percent higher than the US average.
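
For a rough sense of what that 48 percent gap means in absolute terms, the sketch below applies it to an assumed US-average grid carbon intensity of about 370 grams of CO2 per kilowatt-hour; that baseline is our assumption, not a figure from the survey.

```python
# Illustrative only: the US-average grid intensity is an assumed ballpark.
US_AVERAGE_G_CO2_PER_KWH = 370   # assumed grid average, g CO2 per kWh
DATA_CENTER_PREMIUM = 1.48       # 48 percent higher, per the survey

data_center_intensity = US_AVERAGE_G_CO2_PER_KWH * DATA_CENTER_PREMIUM
print(f"~{data_center_intensity:.0f} g CO2 per kWh for data center electricity")
# ~548 g CO2 per kWh at these assumptions
```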

All that said, the MIT authors have a few caveats.

First, we can't look under the hood at closed-source AI models like OpenAI's ChatGPT, and most of the leading AI titans have declined to join in on good-faith climate mapping initiatives like AI Energy Score. Until that changes, any attempt to map such a company's climate impact is a stab in the dark at best.

In addition, the survey's writers note that data centers are not inherently bad for the environment. "If all data centers were hooked up to solar panels and ran only when the Sun was shining, the world would be talking a lot less about AI’s energy consumption," they wrote. But unfortunately, "that's not the case."

In countries like the US, the energy grid used to power data centers is still heavily reliant on fossil fuels, and surging demand for immediate energy is only making that worse. For example, the authors point to Elon Musk's xAI data center outside of Memphis, which is using 35 methane gas generators to keep its chips humming rather than waiting for approval to draw from the civilian power grid.

Unless the industry is made to adopt strategies to mitigate AI's climate impact — like those outlined in the Paris AI Action Declaration — this will just be the beginning of a devastating rise in climate-altering emissions.

More on AI: New Law Would Ban All AI Regulation for a Decade

