According to the Financial Times, a Morgan Stanley analysis forecasts that Microsoft, Oracle, Meta Platforms, and Alphabet could collectively book more than $680 billion in depreciation charges over the next four years. This surge is driven by their massive, capital-intensive build-out of AI data center infrastructure, a shift from their historically asset-light models. The report, highlighted by investor Michael Burry, suggests companies are using optimistic “useful life” assumptions for hardware like GPUs—with Alphabet assuming six years despite Nvidia’s three-year chip cycle—to artificially inflate current earnings. Furthermore, the delay between spending on construction and when depreciation hits the income statement creates a major forecasting blind spot. Disclosure is so poor that Microsoft doesn’t even report a figure for “construction in progress,” and Morgan Stanley’s model suggests Oracle’s annual depreciation could explode from $4bn to $56bn by 2029.
The Accounting Mirage
Here’s the thing about depreciation: it’s a non-cash expense, so it’s easy for investors to mentally brush aside. But that’s a dangerous mistake. The cash has already been spent—hundreds of billions of it. The depreciation charge is just the accounting mechanism that finally recognizes the cost of that spending on the income statement, where profits are calculated. The problem is timing and assumptions. If a company builds a data center over three years, the cash flows out the door immediately, but the big hit to earnings doesn’t start until it’s switched on. And then, how long do you assume that $50,000 GPU will be useful? Say it’s three years, and the cost hits hard and fast. Stretch it to six years, and you’ve halved each year’s charge, magically flattering near-term profits on paper. That’s the game being played, and with minimal disclosure, it’s a black box.
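To make that concrete, here is a minimal, purely illustrative sketch of those two levers, the in-service lag and the useful-life assumption, using a hypothetical $50,000 asset and straight-line depreciation. None of these figures come from any company’s actual books.

```python
# Illustrative only: straight-line depreciation with an in-service lag.
# The asset cost, build time, and useful lives are hypothetical.

def depreciation_schedule(cost, build_years, useful_life, horizon):
    """Yearly charge: zero while under construction, then the cost spread
    evenly over the assumed useful life once the asset is switched on."""
    schedule = []
    for year in range(1, horizon + 1):
        in_service = year > build_years
        still_depreciating = year <= build_years + useful_life
        schedule.append(cost / useful_life if in_service and still_depreciating else 0.0)
    return schedule

asset_cost = 50_000  # the cash leaves the door up front, during the build

for life in (3, 6):
    sched = depreciation_schedule(asset_cost, build_years=2, useful_life=life, horizon=8)
    print(f"{life}-year life:", [f"${c:,.0f}" for c in sched])

# 3-year life: ['$0', '$0', '$16,667', '$16,667', '$16,667', '$0', '$0', '$0']
# 6-year life: ['$0', '$0', '$8,333', '$8,333', '$8,333', '$8,333', '$8,333', '$8,333']
# Same cash outlay either way; the longer assumption halves each year's charge
# and pushes recognition of the cost further into the future.
```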
Why This Is a Recipe for Earnings Shocks
So we have a perfect storm. You’ve got companies like Meta announcing capex will nearly double next year, and Microsoft planning to double its data center footprint. They’re using complex finance leases to build even faster. Yet their financial reporting is stuck in a bygone era of software and ads. Analysts, as the FT notes, are basically fudging their models because the data isn’t there. They model cash flow and then tweak margins until it “hangs together.” But when you’re mixing assets with wildly different life spans—like short-lived GPUs and long-lived buildings—that approach falls apart. Morgan Stanley’s more granular model tries to untangle this, and the results are startling. It suggests that for these companies to hit current profit margin forecasts, their non-depreciation costs would have to absolutely collapse. And have you seen any sign of that? Nope. Costs are soaring.
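To see why the blended-margin shortcut breaks down, here is a toy comparison of depreciating each asset class on its own clock versus smearing one average life over the whole capex pile. The asset mix and lives below are invented for illustration, not Morgan Stanley’s actual inputs.

```python
# Toy example of granular vs. blended depreciation. The capex split and useful
# lives are made up for illustration.

ASSET_CLASSES = {
    # name: (capex in $bn, useful life in years)
    "GPUs and servers": (60, 4),       # short-lived compute
    "Buildings and shells": (25, 25),  # long-lived real estate
    "Networking and power": (15, 5),
}

def first_year_charge(asset_classes):
    """Straight-line, year-one depreciation once everything is in service."""
    return sum(capex / life for capex, life in asset_classes.values())

granular = first_year_charge(ASSET_CLASSES)

# The shortcut: one blended life applied to the whole capex pile.
total_capex = sum(capex for capex, _ in ASSET_CLASSES.values())
blended = total_capex / 10  # a plausible-sounding but arbitrary average life

print(f"Granular model: ${granular:.0f}bn of year-one depreciation")
print(f"Blended model:  ${blended:.0f}bn of year-one depreciation")
# Granular model: $19bn of year-one depreciation
# Blended model:  $10bn of year-one depreciation
# Because short-lived gear dominates the spend, the blended shortcut
# understates the near-term charge and flatters margins.
```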
The Hardware Reality Check
This is where the rubber meets the road. Alphabet assumes its data center equipment lasts six years. But Nvidia is on a relentless, roughly two-year cadence for major new architectures. Does anyone believe a GPU from 2024 will be competitively useful for AI training in 2030? Probably not. The useful life isn’t about when the hardware physically breaks; it’s about when it becomes economically obsolete. And in the AI arms race, obsolescence comes fast. This isn’t just a software update. This is about the physical, industrial-grade computing power at the core of the AI boom. For big tech, the challenge is that their core AI hardware—the GPUs—might have the durability but not the economic longevity they’re banking on.
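One way to size that risk: if hardware carried on the books at a six-year life turns out to be economically done after three, the unexpensed remainder has to be written off or impaired all at once. A rough sketch, with a made-up $10bn fleet, follows.

```python
# Hypothetical write-off math: a GPU fleet depreciated over six years but
# retired as economically obsolete after three. The fleet cost is invented.

def remaining_book_value(cost, assumed_life, years_in_service):
    """Cost not yet expensed under straight-line when the asset is pulled."""
    expensed = cost * min(years_in_service, assumed_life) / assumed_life
    return cost - expensed

fleet_cost = 10_000_000_000  # $10bn of accelerators

write_off = remaining_book_value(fleet_cost, assumed_life=6, years_in_service=3)
print(f"Book value left to write off: ${write_off / 1e9:.0f}bn")
# Book value left to write off: $5bn
# Under a three-year assumption there would be nothing left on the books; under
# six, half the fleet's cost hits earnings the moment it is retired or impaired.
```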
The Big Short Meets Big Tech
Now, the most fascinating signal here might be the alignment of Michael Burry—famous for betting against the 2008 housing bubble—and Morgan Stanley’s sober accounting desk. They’re both pointing at the same potential distortion. Burry calls lowballed depreciation “one of the most common frauds of the modern era.” That’s strong language. Morgan Stanley puts it more politely, saying the numbers depend on “both revenue opportunities and durability of GPUs, both of which are highly uncertain.” Basically, the entire investment thesis requires a miracle: that AI revenue explodes fast enough to outrun these crushing costs, and that the hardware isn’t rendered obsolete within three years. New accounting rules will force better disclosure in 2027. Until then, it’s a game of “buy now, book the cost later.” And as we’ve seen in other bubbles, that game can work beautifully. Until it doesn’t.
