AI's $400 bn problem: Are chips getting old too fast?
In pursuit of the AI dream, the tech industry this year has plunked down about $400 billion on specialized chips and data centers, but questions are mounting about the wisdom of such unprecedented levels of investment.
At the heart of the doubts: overly optimistic estimates about how long these specialized chips will last before becoming obsolete.
With persistent worries of an AI bubble and so much of the US economy now riding on the boom in artificial intelligence, analysts warn that the wake-up call could be brutal and costly.
"Fraud" is how renowned investor Michael Burry, made famous by the movie "The Big Short," described the situation on X in early November.
Before the AI wave unleashed by ChatGPT, cloud computing giants typically assumed that their chips and servers would last about six years.
But Mihir Kshirsagar of Princeton University's Center for Information Technology Policy says the "combination of wear and tear along with technological obsolescence makes the six-year assumption hard to sustain."
One problem: chip makers -- with Nvidia the unquestioned leader -- are releasing new, more powerful processors much faster than before.
Less than a year after launching its flagship Blackwell chip, Nvidia announced that Rubin would arrive in 2026 with performance 7.5 times greater.
At this pace, chips lose 85 to 90 percent of their market value within three to four years, warned Gil Luria of financial advisory firm D.A. Davidson.
Nvidia CEO Jensen Huang made the point himself in March, explaining that when Blackwell was released, nobody wanted the previous generation of chips anymore.
"There are circumstances where Hopper is fine," he added, referring to the older chip. "Not many."
AI processors are also failing more often than in the past, Luria noted.
"They run so hot that sometimes the equipment just burns out," he said.
A recent Meta study on its Llama AI model found an annual failure rate of 9 percent.
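If failures are assumed to be independent from year to year and the rate constant (simplifying assumptions of ours; the study only reports the annual figure), a 9 percent annual rate compounds noticeably over a chip's planned life, as this rough sketch shows:

```python
# Illustrative sketch: cumulative chance that a chip fails at least once,
# assuming a constant 9% annual failure rate and independence across years.
# These assumptions are ours; the cited study only reports the annual figure.

annual_failure_rate = 0.09

for years in (1, 3, 6):
    survives = (1 - annual_failure_rate) ** years
    print(f"After {years} year(s): ~{(1 - survives) * 100:.0f}% chance of at least one failure")

# After 1 year(s): ~9% chance of at least one failure
# After 3 year(s): ~25% chance of at least one failure
# After 6 year(s): ~43% chance of at least one failure
```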
- Profit risk -
For Kshirsagar and Burry alike, the realistic lifespan of these AI chips is just two or three years.
Nvidia pushed back in an unusual November statement, defending the industry's four-to-six-year estimate as based on real-world evidence and usage trends.
But Kshirsagar believes these optimistic assumptions mean the AI boom rests on "artificially low" costs -- and consequences are inevitable.
If companies were forced to shorten their depreciation timelines, "it would immediately impact the bottom line" and slash profits, warned Jon Peddie of Jon Peddie Research.
"This is where companies get in trouble with creative bookkeeping."
The fallout could ripple through an economy increasingly dependent on AI, analysts warn.
Luria isn't worried about giants like Amazon, Google, or Microsoft, which have diverse revenue streams. His concern centers on AI specialists such as Oracle and CoreWeave.
Both companies are already heavily indebted while racing to buy more chips to compete for cloud customers.
Building data centers requires raising significant capital, Luria points out.
"If they look like they're a lot less profitable" because equipment must be replaced more frequently, "it will become more expensive for them to raise the capital."
The situation is especially precarious because some loans use the chips themselves as collateral.
Some companies hope to soften the blow by reselling older chips or using them for less demanding tasks than cutting-edge AI.
A chip from 2023, "if economically viable, can be used for second-tier problems and as a backup," Peddie said.