The technology industry has invested about $400 billion in specialized AI processors and data centers this year, but there are growing warnings that these investments are based on overly optimistic assumptions. The key question is: How long can these chips really last before they become obsolete?
Before the AI boom, large cloud providers assumed that server processors would remain in service for about six years. Today, analysts warn that this is no longer realistic: AI chips run under enormous thermal stress and wear out faster, their performance degrading irreversibly, while new generations arrive at a pace the industry has never seen before.
Nvidia, the undisputed market leader, has announced that its Rubin chip will arrive in 2026 and run 7.5 times faster than the Blackwell model, which was introduced less than a year earlier. According to an analysis by DA Davidson, this pace of development means that AI chips lose 85 to 90 percent of their market value within three to four years.
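To get a feel for what that pace implies, the total loss can be converted into a constant annual decline rate. This is a back-of-the-envelope sketch: the 85 to 90 percent loss over three to four years is DA Davidson's figure, but the conversion to an annual rate is our own illustration.

```python
# Back-of-the-envelope: convert a total value loss over N years
# into the implied constant annual depreciation rate.
def implied_annual_decline(total_loss: float, years: float) -> float:
    """If an asset loses `total_loss` (e.g. 0.85) of its value over
    `years`, the constant annual rate r satisfies
    (1 - r) ** years == 1 - total_loss."""
    return 1 - (1 - total_loss) ** (1 / years)

# DA Davidson's range: 85-90% of value lost within 3-4 years.
mild = implied_annual_decline(0.85, 4)    # ~38% lost per year
severe = implied_annual_decline(0.90, 3)  # ~54% lost per year
print(f"{mild:.0%} to {severe:.0%} per year")
```

Even at the mild end of the range, the hardware sheds more than a third of its value every year, far faster than the roughly 15 percent annual write-down a six-year schedule assumes.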
AI chips are losing value much faster than the industry expected
At the same time, failure rates are also rising. In an internal infrastructure report, Meta found an annualized failure rate of up to 9 percent for the GPUs used to train its Llama models, far higher than in previous generations of GPU systems.
If the chips actually last only two to three years, companies are effectively operating with artificially low costs. Shortening the depreciation period would cause reported profits to drop sharply, and some companies could find themselves in a financing crisis.
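A minimal straight-line depreciation sketch shows why a shorter useful life hits profit directly. The $10 billion purchase here is hypothetical; only the six-year versus three-year schedules come from the article.

```python
# Straight-line depreciation: annual expense = cost / useful life.
# Halving the assumed life doubles the annual charge against profit.
def annual_depreciation(cost: float, useful_life_years: float) -> float:
    return cost / useful_life_years

# Hypothetical: a cloud provider capitalizes $10B of AI chips.
cost = 10_000_000_000
six_year = annual_depreciation(cost, 6)    # ~$1.67B expensed per year
three_year = annual_depreciation(cost, 3)  # ~$3.33B expensed per year
extra_hit = three_year - six_year          # ~$1.67B/yr off pretax profit
print(f"extra annual expense: ${extra_hit / 1e9:.2f}B")
```

The same purchase, restated on a three-year schedule, takes roughly an extra $1.7 billion per year out of pretax profit, which is why the choice of depreciation period matters so much to the companies involved.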
The biggest risk is not to Amazon, Google, or Microsoft, which have diversified revenue streams, but to companies like Oracle and CoreWeave, which are taking on heavy debt as they rapidly buy chips to survive in the cloud services market. A further problem is that some credit lines use the processors themselves as collateral, a risky arrangement when the hardware loses value this quickly.
The industry is trying to mitigate the consequences by selling older chips or repurposing them for less demanding AI tasks, but analysts warn that this will not be enough if the obsolescence cycle is further shortened.
In an era when the entire economy relies ever more heavily on generative artificial intelligence, the short service life and rapid devaluation of AI processors are becoming a problem that could seriously shake today's tech boom, MSN reports.