Tracking Wealth Through the AI Lens
So, the AI gold rush continues, and everyone's tripping over themselves to buy the latest and greatest GPUs. But here's the question nobody seems to want to answer honestly: how long is this stuff actually useful? We're talking about multi-million dollar investments here, and the "experts" are all over the place.
Infrastructure giants like Google and Microsoft are saying six years for their AI servers. Six years? Seriously? That's like saying your iPhone 6 is still cutting edge. Michael Burry, the guy who called the 2008 crash, thinks two to three years is more like it. And then you've got CoreWeave, the GPU rental company, sticking with six-year depreciation cycles. Who the hell do we believe? (Source: CNBC, "The question everyone in AI is asking: How long before a GPU depreciates?")
Let's be real, depreciation is just a fancy accounting trick to make things look better (or worse) on paper. Dustin Madsen, VP of the Society of Depreciation Professionals (yes, that's a real thing, apparently), says it's all just a "financial estimate by management." In other words, it's a guess. A guess influenced by "technological obsolescence, maintenance, historical lifespans, and engineering analysis." Sounds scientific, right? But it's still a freakin' guess.
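To see how much that "guess" actually matters, here's a quick back-of-the-envelope sketch. The numbers are hypothetical (a made-up $10 million GPU cluster, not anyone's actual filings), and it uses plain straight-line depreciation, which is the simplest version of what these companies report. The point is just how far apart a six-year schedule and Burry's two-to-three-year view land.

```python
# Illustrative only: hypothetical $10M GPU cluster, straight-line depreciation,
# zero salvage value. Made-up numbers to show how much the useful-life
# "guess" swings reported expense.

def straight_line(cost, salvage, useful_life_years):
    """Annual depreciation expense under a straight-line schedule."""
    return (cost - salvage) / useful_life_years

cost = 10_000_000   # hypothetical purchase price
salvage = 0         # assume the hardware is worth nothing at end of life

for years in (6, 3, 2):
    annual = straight_line(cost, salvage, years)
    print(f"{years}-year life: ${annual:,.0f} expense per year")

# 6-year life: $1,666,667 per year
# 3-year life: $3,333,333 per year
# 2-year life: $5,000,000 per year
```

Same hardware, same cash out the door. The only thing that changed is the estimate, and the shorter schedule doubles or triples the annual hit to reported earnings. That's the lever everyone is arguing about.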
CoreWeave's stock is down 57% from its high in June. Oracle's shares have plummeted 34% from their record high in September. Coincidence? I think not. Investors are starting to wake up and realize that maybe, just maybe, these companies are overspending on tech that's going to be obsolete faster than a fidget spinner.
And then you have Amazon, quietly decreasing the useful life of some of its servers from six years to five. They're blaming it on the "rapid pace of technology development in AI and machine learning." Translation: "We screwed up, and we're trying to cover our asses."
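For scale, here's what shaving one year off a six-year schedule does, using a hypothetical $100 billion server fleet. Again, these are made-up round numbers for illustration, not Amazon's actual books.

```python
# Hypothetical $100B server fleet, straight-line, zero salvage value.
# Not Amazon's real figures -- just showing what a 6-to-5-year change does.
fleet_cost = 100_000_000_000

six_year = fleet_cost / 6    # ~$16.7B of depreciation per year
five_year = fleet_cost / 5   # $20.0B of depreciation per year

print(f"Extra annual expense from the change: ${five_year - six_year:,.0f}")
# -> roughly $3.3B more expense per year, just from moving the estimate by one year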

Nvidia is now pumping out new AI chips annually, compared to the old two-year cycle. Jensen Huang even joked that the value of the Hopper would tank when the Blackwell chip starts shipping. He joked about it! It's like he's admitting they're deliberately designing obsolescence into their products. It's genius, if you're Nvidia. For everyone else, it's a giant middle finger.
Satya Nadella over at Microsoft is trying to "space out AI chip purchases to avoid overinvesting in a single generation of processors." Good luck with that, buddy. You're basically admitting you're playing a losing game against Nvidia's release cycle. It's like trying to win a footrace against a cheetah while you're strapped into roller skates.
CoreWeave claims its Nvidia A100 chips are still fully booked, and that a batch of H100s was immediately re-booked at 95% of the original price after a contract expired. Okay, sure. But how long before those H100s are yesterday's news? How long before the next "must-have" chip makes them look like ancient relics? Six years? Give me a break.
Here's the thing: this whole AI boom feels a little too much like the dot-com bubble 2.0. Everyone's throwing money at anything with "AI" in the name, without really thinking about the long-term consequences. And when the music stops, guess who's going to be left holding the bag full of depreciated GPUs?
Are these companies really planning for the long-term, or are they just trying to pump up their stock prices and cash out before the whole thing comes crashing down? I mean, let's be real, how many of these AI "innovations" are actually solving real problems, and how many are just fancy ways to sell us more targeted ads?
This whole "AI revolution" smells like a marketing scheme designed to separate fools from their money. And honestly, I'm starting to feel like the biggest fool of all for even paying attention to it.