Even as tariffs loom, big tech companies still plan to pour staggering sums into AI infrastructure. But the return they're getting on these moonshot investments should give you pause.

By the numbers: Microsoft, Meta, Google, and Amazon will together spend more than $270 billion in capital expenditures to build AI data centers in 2025 alone, according to a Citigroup estimate cited by The Wall Street Journal.

But looking at Amazon as a case study, only a fraction of that centibillion sum is coming back as revenue. John Blackledge, a tech analyst at TD Cowen, told the WSJ that the e-commerce giant's cloud computing division, Amazon Web Services, has historically earned $4 in incremental revenue for every $1 spent. With generative AI, the ratio is roughly inverted: around 20 cents for every dollar, according to Blackledge, who nonetheless maintains that within a few years, Amazon will get closer to that $4 return.

It's difficult to pin down exactly how much of that is being spent on AI, because Amazon doesn't break out its spending in that detail. But last year, AWS reported an operating income of $39.8 billion, and it was a cash cow long before the AI boom.

That AI is a money pit shouldn't be surprising. It's been clear for some time now that generative AI's road to profitability will be a rocky one, if it even leads to that destination at all. As evidenced by the above figures, it's an enormously expensive technology, both to develop and to run. Vast amounts of data are scraped to train the AI models, and the pricey data centers that run them rack up huge energy bills, water bills, and maintenance costs even after they're constructed.

The other side of the equation, how these AI services will actually make money, is also fraught with uncertainty. OpenAI's decision to charge customers an eye-watering $200 per month for full access to ChatGPT, for example, reeks of desperation. Power users may consider it worthwhile, but most people will have second thoughts about paying even a $20 subscription fee.

Above all, big tech hasn't convinced customers, or even some of its own investors, of what AI will be out-and-out useful for. Chatbots are great at writing emails, but many attempts to implement AI models in business settings, such as customer service roles, have fallen flat. Apple's stab at the technology has been a dud at best and a disaster at worst; a tool for generating custom emojis does not suggest a company at the height of innovation, and it evinces an industry bereft of new ideas or clear direction. Meanwhile, serious questions about the tech's proclivity to hallucinate, or generate false information, persist.

Despite talking a big game, some in the tech sector are getting cold feet amid the tariffs. Microsoft, which still insists it plans to spend $80 billion on AI infrastructure this year, signaled as much by scaling back or pulling out of numerous data center projects around the globe.

Whether Amazon will blink remains to be seen. The company has indicated it will spend a total of $100 billion in capital expenditures this year, with the majority going toward AWS. In its splashiest move yet, Amazon is building one of the world's largest data center clusters, codenamed Project Rainier, for Anthropic to train its models on. The cluster will pack hundreds of thousands of custom-built "Trainium 2" chips, delivering five times the processing power used to train current leading AI models. Nonetheless, Amazon is keeping the specific costs associated with the project a closely guarded secret.

More on AI: Quartz Fires All Writers After Move to AI Slop
