There’s a wild paradox in the middle of the biggest story in tech right now. The GPUs and other essential hardware that the hyperscalers are spending so lavishly to pack into their data centers, it turns out, go obsolete in a hurry. That’s the view detailed in an excellent new report from Research Affiliates, a firm that oversees around $200 billion in investment strategies for the RAFI index funds and ETFs. Author Chris Brightman, RA’s CEO, contends that the AI arms race has effectively created a new industrial era. In this transformed ecosystem, companies aren’t “investing” in the traditional sense. Rather, they’re churning through equipment at such a rapid tempo to generate sales that the very meaning of capex is changing.
“They’re more like supermarkets than traditional tech or industrial enterprises, but their turnover isn’t in the likes of grocery items. It’s the stuff that generates their large language models, vector search and other products,” Brightman told me in a phone interview. “They’re in an arms race where they need to replace their hardware very rapidly, in other words, restock their shelves in a hurry.” The problem, Brightman asserts, is that hyperscalers are taking losses on the large language models, vector databases and other products they’re selling to companies and consumers, so the more hardware they buy, the more money they lose. “Right now, each is using AI to maintain crucial dominance in its field, and that makes sense,” Brightman observes. But, he adds, the immense spending needed to maintain those “moats” and keep rivals at bay could generate puny returns going forward, and harm their overall profitability.
In the article, Brightman spotlights the historic surge in AI capex, which has mushroomed from $250 billion in 2024 to $650 billion this year by Bloomberg’s estimate, equal to 2% of GDP. The industry’s historic appetite for capital spawned the view that AI is becoming the new steel or railroads. But as Brightman points out, the equipment and infrastructure that supported those businesses are far different from the gear that drives AI. “Steel mills and rail tracks depreciated over 40 to 45 years,” he writes. He then contrasts those multi-decade useful lives with the scenario in AI. Hyperscalers such as Microsoft, Amazon, Alphabet and Meta are depreciating their GPUs and other hardware over roughly five or six years on their income statements. Although those spans appear short, he says, the real “lives” of the assets are much shorter.
In an economic sense, assets become fully depreciated, or turn obsolete, when the revenues they generate no longer cover their cost of acquisition (reflected in yearly depreciation), operating expense, and cost of capital. According to Brightman, the industry numbers show that AI hardware loses its value over about three years. As proof, he cites data on the profitability of Nvidia’s industry-standard H100 GPUs. In its second year, an H100 spawned $36,000 in annual profit for a 137% return on investment. But by year four, the product was losing over $4,400 for a negative ROI of 34%, and the results sank fast from there. Writes Brightman, “The economic life of AI hardware is [a lot] shorter than its accounting life.”
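The economic-obsolescence test Brightman describes can be sketched in a few lines. The figures below are hypothetical placeholders chosen to mimic the pattern, not Brightman's actual H100 data: a chip can still be running physically while its rental revenue falls below the sum of depreciation, operating expense, and a capital charge.

```python
# Minimal sketch of the economic-obsolescence test described above.
# All dollar figures and rates are illustrative assumptions, not report data.

def economic_profit(revenue, depreciation, opex, invested, cost_of_capital_rate):
    """Profit after depreciation, operating expense, and a capital charge."""
    capital_charge = invested * cost_of_capital_rate
    return revenue - depreciation - opex - capital_charge

def roi(profit, invested):
    return profit / invested

# A GPU bought for $25,000, straight-line depreciated over a 5-year accounting life.
invested = 25_000
depreciation = invested / 5          # $5,000 per year on the books

# Year 2: rental revenue is still strong.
p2 = economic_profit(revenue=45_000, depreciation=depreciation,
                     opex=8_000, invested=invested, cost_of_capital_rate=0.10)

# Year 4: newer chips undercut rental rates even though this GPU still runs.
p4 = economic_profit(revenue=12_000, depreciation=depreciation,
                     opex=8_000, invested=invested, cost_of_capital_rate=0.10)

print(f"year 2: profit ${p2:,.0f}, ROI {roi(p2, invested):.0%}")   # positive
print(f"year 4: profit ${p4:,.0f}, ROI {roi(p4, invested):.0%}")   # negative
```

Once economic profit turns negative, the asset is obsolete in Brightman's sense, even though the accounting schedule would keep depreciating it for another year or two.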
It’s not that the equipment wears out. Physically, it can actually run a lot longer. The reason AI hardware loses potency so fast: Nvidia, AMD and the other producers are crafting fresh offerings that each year provide enormous increases in computing power per watt deployed. Since the hyperscalers face tough energy constraints, they’re constantly seeking gobs of new “compute” that consumes only dollops of extra electricity. Normally, if typical manufacturers were adding capital at the pace the hyperscalers are setting in AI, they’d already have built a gigantic base of equipment and infrastructure they could deploy for years, without the need to keep buying more. Not so in this brave new business. AI equipment is evolving so fast that each year, the hyperscalers need to replace an immense part of their capital base just to maintain the same capacity for forging AI wonders. “Most of their spending isn’t growth capex, it’s ‘maintenance’ capex,” says Brightman. Nevertheless, the overall numbers are so huge that although only about one-third goes to expansion, that’s still enough to vastly expand the volume of products and services they can deliver each year.
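The maintenance-versus-growth split follows mechanically from a short economic life. A toy steady-state model makes the point; the fleet value, life, and growth rate below are illustrative assumptions, not any company's actual figures.

```python
# Sketch: why a short economic life turns most capex into replacement spending.
# Assumes a simple steady-state fleet model with illustrative numbers.

def required_capex(fleet_value, economic_life_years, growth_rate):
    """Annual spend needed to replace aging hardware AND grow the fleet."""
    maintenance = fleet_value / economic_life_years   # replace 1/life each year
    growth = fleet_value * growth_rate                # net new capacity on top
    return maintenance, growth

# A $600B installed base, a 3-year economic life, 15% annual capacity growth.
maintenance, growth = required_capex(fleet_value=600e9,
                                     economic_life_years=3,
                                     growth_rate=0.15)
total = maintenance + growth

print(f"maintenance: ${maintenance/1e9:.0f}B ({maintenance/total:.0%} of capex)")
print(f"growth:      ${growth/1e9:.0f}B")
```

With a three-year economic life, a third of the fleet must be repurchased every year just to stand still, which is why roughly two-thirds of the budget in this sketch is maintenance capex even while capacity keeps growing.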
The hyperscalers are using AI, and taking big losses, chiefly to protect their turf
In our phone calls, Brightman nailed the conundrum for the giants of AI. “As they ramp the compute, they lose more and more money,” he says. “But they have plenty of rationale to do so for now.” All of the Big Four aim to provide the best AI features to enhance their signature offerings, and recognize that they’ll lose their leadership in those staples if the AI component isn’t top notch. Amazon makes most of its money providing computation and storage in the cloud. It’s unable to recoup anywhere near the cost of the AI additions from its customers, says Brightman. “But it’s sensible because if Amazon doesn’t stay in the arms race, they’ll lose the cloud business. They need the AI services as part of the cloud component.”
As for Microsoft, its staple is office software that generates subscription revenues, notably on its 365 platform. That franchise now faces stiff competition from Google’s Docs and Sheets products. “To protect its existing business and keep its customers, Microsoft has to offer AI model services, even if it’s losing money on its AI capex,” declares Brightman. Alphabet is pre-eminent in “search,” and cleans up as the world’s biggest seller of online ads. Microsoft has mounted a challenge with its own search engine. “To continue its profitable line of business and keep its edge, Alphabet needs the AI element, and that requires big investments in data centers,” says Brightman.
Meta’s got to worry about the other three invading its highly lucrative social media advertising business. “People come to their platform to see the pictures and the video, and it costs Meta a lot of money to produce that content that supports the ads,” notes Brightman. Meta uses AI to personalize feeds for users, rank content on Instagram and Facebook, and check postings for safety, and needs those uses to maintain its lead. Yet once again, says Brightman, it can’t yet charge enough for its ads to pay for the gigantic new spending needed to provide those fantastic features.
Brightman concludes that the gusher in AI investment doesn’t mean that this revolutionary advance will prove a big profit spinner for the Big Four. It’s more a weapon for each titan to defend its domain. “When capital turns over rapidly, and competition forces continuous reinvestment, extraordinary spending can sustain competitive position without creating value for shareholders,” he states in the article. Once again, the shelf life of what’s filling our data centers is so brief that buying GPUs, say, is more like replenishing supermarket stocks than building factories that endure for decades.
On the other hand, Brightman told me that the very technology costing these champions so dearly helped him greatly in preparing his analysis. “A year ago, this project would have taken me nine months to do the research and modeling. But I used the best of Claude, ChatGPT, and Gemini, and synthesized their feedback, and did it start to finish in three weeks,” he recounts. Brightman’s vignette tells the story. This new industrial era may be a lot more beneficial to the folks and businesses that use the AI-enhanced products than to the enterprises that furnish them.
