As the likes of Microsoft, which recently unveiled its ChatGPT-powered Bing search engine, usher in an era of ubiquitous AI, there's certainly no dearth of ethical and legal questions being raised. But there's another worrying aspect of the technology that's received far less attention: its environmental impact, Wired reports.
"There are already huge resources involved in indexing and searching internet content, but the incorporation of AI requires a different kind of firepower," Alan Woodward, a cybersecurity expert at the University of Surrey, told the magazine.
"It requires processing power as well as storage and efficient search. Every time we see a step change in online processing, we see significant increases in the power and cooling resources required by large processing centers," he added. "I think this could be such a step."
University of Coruña computer scientist Carlos Gómez-Rodríguez told Wired that training the large language models (LLMs) that underpin tools like ChatGPT is so prohibitively resource intensive that essentially "only the Big Tech companies can train them."
Unfortunately, neither Microsoft nor Google has publicly disclosed how much computational power they're consuming to get their chatty AIs off the ground. But an independent analysis cited by Wired found that training OpenAI's GPT-3 model (which ChatGPT runs on) consumed 1,287 megawatt hours and generated more than 550 tons of carbon dioxide equivalent, which the outlet compared to "the same amount as a single person taking 550 roundtrips between New York and San Francisco."
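To put that 1,287 megawatt-hour figure in rougher household terms, here's a quick back-of-envelope sketch. The per-household consumption figure is an assumption drawn from US Energy Information Administration estimates, not from the article:

```python
# Back-of-envelope: how much electricity is 1,287 MWh?
# Assumption (not from the article): an average US household uses
# roughly 10.6 MWh of electricity per year (an EIA estimate).
TRAINING_MWH = 1_287
HOUSEHOLD_MWH_PER_YEAR = 10.6

household_years = TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR
print(f"~{household_years:.0f} US households powered for a year")
```

Roughly 120 homes' worth of annual electricity, spent on a single training run.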
On its own, not that bad. But consider that "not only do you have to train it," Gómez-Rodríguez says, "but you have to execute it and serve millions of users."
Or billions, in fact, now that Bing and Google are gearing up to serve the technology to their global user bases.
Let's be fair, though. How does an AI search compare in efficiency to the old-fashioned search engines we use now?
"At least four or five times more computing per search," Martin Bouchard, co-founder of the sustainable data center company QScale, estimated to Wired.
Multiplied across billions of daily searches, that adds up fast. And Bouchard notes that the computing demands will only grow once the AIs are trained on current data rather than datasets from a snapshot in time.
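A rough sense of what Bouchard's multiplier could mean at scale, using two illustrative assumptions that come from outside the article: an oft-cited figure of about 0.3 watt-hours per conventional Google search (published by Google in 2009), and a commonly quoted estimate of roughly 8.5 billion Google searches per day:

```python
# Hypothetical scaling of Bouchard's "4-5x more computing per search."
# Assumptions (illustrative, not from the article):
#   BASELINE_WH: ~0.3 Wh per conventional search (Google's 2009 figure)
#   SEARCHES_PER_DAY: ~8.5 billion Google searches per day
BASELINE_WH = 0.3
SEARCHES_PER_DAY = 8.5e9

for multiplier in (4, 5):
    # Extra energy beyond the conventional baseline, per day
    extra_wh = (multiplier - 1) * BASELINE_WH * SEARCHES_PER_DAY
    print(f"{multiplier}x: ~{extra_wh / 1e9:.2f} GWh of extra energy per day")
```

Under these assumptions, a 4-5x multiplier would mean somewhere on the order of 7.5 to 10 gigawatt-hours of additional energy per day for one search engine alone; the point is the order of magnitude, not the precise numbers.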
"Current data centers and the infrastructure we have in place will not be able to cope with [the race of generative AI]," he added. "It's too much."
More on generative AI: Microsoft Is Apparently Discussing ChatGPT's Bizarre Alternate Personality