In pursuit of the AI dream, the tech industry this year has plunked down about $400 billion on specialized chips and data centers, but questions are mounting about the wisdom of such unprecedented levels of investment.
At the heart of the doubts: overly optimistic estimates about how long these specialized chips will last before becoming obsolete.
With persistent worries of an AI bubble and so much of the U.S. economy now riding on the boom in artificial intelligence, analysts warn that the wake-up call could be brutal and costly.
"Fraud" is how renowned investor Michael Burry, made famous by the movie "The Big Short," described the situation on X in early November.
Before the AI wave unleashed by ChatGPT, cloud computing giants typically assumed that their chips and servers would last about six years.
But Mihir Kshirsagar of Princeton University's Center for Information Technology Policy says the "combination of wear and tear along with technological obsolescence makes the six-year assumption hard to sustain."
One problem: chip makers — with Nvidia the unquestioned leader — are releasing new, more powerful processors much faster than before.
Less than a year after launching its flagship Blackwell chip, Nvidia announced that its successor, Rubin, would arrive in 2026 with performance 7.5 times greater.
At this pace, chips lose 85 to 90 percent of their market value within three to four years, warned Gil Luria of financial advisory firm D.A. Davidson.
Nvidia CEO Jensen Huang made the point himself in March, explaining that when Blackwell was released, nobody wanted the previous generation of chip anymore.
"There are circumstances where Hopper is fine," he added, referring to the older chip. "Not many."
AI processors are also failing more often than in the past, Luria noted.
"They run so hot that sometimes the equipment just burns out," he said.
A recent Meta study of the hardware used to train its Llama AI model found an annual failure rate of 9 percent.
For Kshirsagar and Burry alike, the realistic lifespan of these AI chips is just two or three years.
Nvidia pushed back in an unusual November statement, defending the industry's four-to-six-year estimate as based on real-world evidence and usage trends.
But Kshirsagar believes these optimistic assumptions mean the AI boom rests on "artificially low" costs — and consequences are inevitable.
If companies were forced to shorten their depreciation timelines, "it would immediately impact the bottom line" and slash profits, warned Jon Peddie of Jon Peddie Research.
"This is where companies get in trouble with creative bookkeeping."