“It’ll happen slowly, and then all at once.”
Freed from the obligations of reporting and client management, Burry returned to X with a message that cut straight through the current AI euphoria. To him, the boom in GPUs, data centers, and trillion-dollar AI bets isn’t evidence of unstoppable growth; it’s evidence of a financial cycle that looks increasingly distorted, increasingly crowded, and increasingly fragile.
Meta did not respond to a request for comment. Oracle declined to comment.
“He’s spot on,” Morrow tells Fortune. He has been making this argument for months, warning that a “tsunami of depreciation” could quietly flatten Big Tech’s AI profits. Behind the trillion-dollar boom in chips, data centers, and model training, he argues, lies a simple but powerful illusion: companies have quietly extended the accounting lives of their machines and semiconductor chips, lengthening the assumed period over which they wear out and depreciate.
“Companies are very aware of it,” Morrow claims. “They’ve gone to great lengths to change their accounting and depreciation schedules to get ahead of it—to effectively avoid all this capex hitting their income statements.”
“These aren’t small numbers—they’re huge. And the fact that someone like Burry is calling it out tells you people are starting to notice what’s happening between the lines of the balance sheet.”
That simple change lets them spread out their costs and report fatter earnings now.
“Had they not made those changes,” Morrow says, “their earnings would be dramatically lower.”
In other words, depreciation was already a major cost, and management chose to stretch the timeline over which it is recognized. The policy change doesn’t prove Burry’s dollar totals across firms, but it moves reported earnings in exactly the direction he describes: near-term depreciation expense falls, and more of the cost is pushed into later years.
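The mechanics are ordinary straight-line accounting. A minimal sketch, using a hypothetical $100 billion asset base rather than any company’s actual figures, shows how stretching an assumed three-year useful life to six cuts the near-term expense in half:

```python
# Illustrative only: straight-line depreciation on a hypothetical
# $100B of AI hardware, comparing a three-year useful life with the
# longer five-to-six-year lives several hyperscalers have moved toward.
CAPEX = 100e9  # hypothetical gross asset base, in dollars

def annual_depreciation(cost: float, useful_life_years: float) -> float:
    """Straight-line: the same slice of cost is expensed each year."""
    return cost / useful_life_years

short_life = annual_depreciation(CAPEX, 3)  # ~$33.3B expensed per year
long_life = annual_depreciation(CAPEX, 6)   # ~$16.7B expensed per year

print(f"3-year life: ${short_life / 1e9:.1f}B of expense per year")
print(f"6-year life: ${long_life / 1e9:.1f}B of expense per year")
print(f"Near-term earnings lift: ${(short_life - long_life) / 1e9:.1f}B per year")
# The total cost never changes; the longer life only defers it.
```

The longer life doesn’t make the hardware any cheaper; it only shifts when the cost shows up on the income statement.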
To be sure, Morrow’s expertise is in value investing and high-dividend companies, not the trendier high-growth tech stocks; in fact, he says he holds no long positions in technology broadly. He therefore stands to benefit if Big Tech multiples compress or if markets begin to reprice the costs buried in AI spending. Still, his critique aligns with growing unease among other analysts.
Richard Jarc, an analyst at Uncovered Alpha, has raised similar alarms about the mismatch between AI chip lifecycles and corporate accounting.
By The Economist’s estimates, if those assets were depreciated over three years instead of the longer timelines companies now assume, annual pre-tax profits would fall by $26 billion, roughly an 8% hit. A two-year schedule would double that loss, and if depreciation truly matched Nvidia’s pace, the implied market value hit could reach $4 trillion.
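Treating The Economist’s two reported figures as given, a quick back-of-the-envelope check shows the profit base they imply and the two-year scenario:

```python
# Back-of-the-envelope check on The Economist's sensitivity figures.
# The reported pairing ($26B hit = ~8% of pre-tax profits) implies a
# profit base of roughly 26 / 0.08 = $325B across its sample.
hit_3yr = 26e9           # reported extra annual depreciation on a 3-year life
share_of_profit = 0.08   # reported share of annual pre-tax profits

implied_profit_base = hit_3yr / share_of_profit
hit_2yr = 2 * hit_3yr    # reported: a 2-year schedule doubles the loss

print(f"Implied pre-tax profit base: ${implied_profit_base / 1e9:.0f}B")
print(f"2-year schedule: ${hit_2yr / 1e9:.0f}B, "
      f"about {hit_2yr / implied_profit_base:.0%} of profits")
```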
Not everyone buys the doom loop. In a note to clients this week, Bank of America’s semiconductor team argued that the market’s sudden skepticism about AI capex is evidence that the trade is far less overcrowded than critics claim.
The recent selloff in megacap AI names, the team led by Vivek Arya wrote, was driven by “correctable macro factors”—shutdown jitters, weak jobs data, tariff confusion, even misinterpreted OpenAI comments—rather than any real deterioration in AI demand. In fact, the firm pointed to surging ancillary segments like memory and optical (up 14% last week), as well as Nvidia’s disclosure of $500 billion-plus in 2025–26 data-center orders, as signs the underlying spending cycle remains “robust.”
But Morrow is most worried that investors are mistaking raw spending for real growth. The market, he argues, has stopped distinguishing between capital intensity and genuine productivity. The AI boom has pushed valuations from the realm of software multiples into something closer to industrial-scale infrastructure math. Building just one hyperscale, one-gigawatt data center, enough to support a cutting-edge AI model, can run around $50 billion, with the majority devoted to GPUs, followed by the buildings, cooling systems, and power infrastructure.
The reality check: “none of these companies has ever managed a $50 billion project before,” he says. “Now they’re trying to do fifty of them at once.”
“Every month a $35 billion stack of GPUs sits without power, that’s a billion dollars of depreciation just burning a hole in the balance sheet,” he says. “So of course they’re panicking — and ordering their own turbines.”
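Morrow’s monthly figure checks out arithmetically if one assumes a roughly three-year accounting life for the hardware (an assumption; he doesn’t state one):

```python
# Sanity-checking the "$1B a month" claim: a $35B GPU fleet on a
# straight-line schedule over an assumed 36-month (3-year) life.
fleet_cost = 35e9
useful_life_months = 36  # assumed; Morrow doesn't specify the life

monthly_depreciation = fleet_cost / useful_life_months
print(f"${monthly_depreciation / 1e9:.2f}B of depreciation per month")  # ~$0.97B
# The expense accrues whether or not the racks are powered, so an
# unpowered stack is pure cost with zero offsetting revenue.
```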
He thinks the result will be a massive power glut by the late 2020s: an overbuilt grid, over-leveraged utilities, and ratepayers stuck with the bill.
“We’ve seen this movie before—in shale, in fiber, in railroads,” he says. “Every capital-spending boom ends the same way: overcapacity, low returns, and a bailout.”
The biggest risk, he says, is that investors have stopped looking at balance sheets entirely. He points to the sheer concentration of today’s market: nearly half of all 401(k) money is now effectively tied to six megacaps.
“This is the most crowded trade in history,” he says. “When it turns, it’s going to turn fast.”



