During a recent public appearance, OpenAI CEO Sam Altman admitted that he wants a large chunk of the world's power grid to help him run artificial intelligence models.
As Laptop Mag flagged, he dropped that bomb during AMD's AI conference last week, after AMD CEO Lisa Su, who counts Altman as both a client and a friend, mentioned ChatGPT's recent outages.
Though OpenAI hasn't revealed the exact causes of its massive June outage, there's a good chance it had to do with running out of computing power. That seems all the more probable given that Altman admitted earlier this year that the company had run out of graphics processing units, or GPUs, the high-end computer chips that AMD sells and that companies like OpenAI use to power their large language models (LLMs).
Speaking to that likelihood, Su asked Altman: "Are there ever going to be enough GPUs?"
With a chuckle, the inscrutable executive paused before responding — and then essentially said the quiet part out loud.
"Theoretically, at some points, you can see that a significant fraction of the power on Earth should be spent running AI compute," Altman said. "And maybe we're going to get there."
To reiterate: the CEO of the world's largest AI company said he believes a "significant fraction" of the electricity on this planet should be used to run AI — and said so to the CEO of a company whose GPUs he recently committed to purchasing, too.
Though Su moved on quickly from the exchange, the undercurrent beneath Altman's admission is, to paraphrase Laptop Mag's Madeline Ricchiuto, low-key nightmare fuel.
Perhaps most upsetting about Altman's flippant admission is the environmental impact he so casually ignored. Conventional electricity generation often relies on the combustion of fossil fuels, which have been harming our planet since long before OpenAI was a twinkle in Altman's eye.
Add a new electricity-guzzling industry like AI to a power grid already stretched to the brink, and you've got a serious problem — one that Altman, Su, and everyone else boosting AI seem unwilling to face head-on.
In a new blog post in which the OpenAI CEO claimed that the world is approaching what he calls a "gentle singularity," or the point at which artificial intelligence meets or surpasses the capabilities of humans, Altman attempted to explain how much power ChatGPT uses — but his description fell short.
"People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes," the CEO wrote. "It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon."
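Altman's per-query comparisons do hold up arithmetically. A quick back-of-envelope sketch — using assumed appliance wattages of roughly 1,000 W for an oven and 10 W for a high-efficiency LED bulb, figures not stated in his post — shows how 0.34 watt-hours translates into his "little over one second" and "couple of minutes" claims:

```python
# Back-of-envelope check of Altman's 0.34 Wh-per-query comparisons.
# Appliance wattages below are illustrative assumptions, not figures
# from Altman's blog post.
QUERY_WH = 0.34   # claimed watt-hours per ChatGPT query

OVEN_W = 1000     # assumed oven power draw (watts)
BULB_W = 10       # assumed high-efficiency LED bulb draw (watts)

# time = energy / power; convert hours to seconds or minutes
oven_seconds = QUERY_WH / OVEN_W * 3600
bulb_minutes = QUERY_WH / BULB_W * 60

print(f"Oven runtime:  {oven_seconds:.2f} seconds")  # ~1.22 s
print(f"Bulb runtime:  {bulb_minutes:.2f} minutes")  # ~2.04 min
```

Under those assumptions, the query's energy budget works out to about 1.2 seconds of oven time and about two minutes of bulb time — consistent with Altman's framing, though the framing itself is the issue, as the per-unit figures below make clear only in aggregate.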
Breaking ChatGPT's energy use down into its smallest per-query units, notably, obscures more meaningful projections of the chatbot's actual energy toll.
According to a recent analysis by the University of California, Riverside and the Washington Post, ChatGPT already uses nearly 40 million kilowatt-hours of energy per day — enough to power the Empire State Building for about 18 months, or to charge eight million smartphones. Notably, those figures don't take into account any other LLMs or AI systems, meaning the real environmental impact of AI is even greater.
Despite how much of the world's energy his industry already consumes, Altman is saying that he and his fellow travelers will need ever more of it to keep their hallucination machines running — and as Su suggested, there may never be enough GPUs to satisfy that power hunger.
More on AI and energy: Former Google CEO Tells Congress That 99 Percent of All Electricity Will Be Used to Power Superintelligent AI