Article's entire thesis looks like it could be completely derailed if one thing happened: AI infrastructure firms cease to be able to secure more capital.
Is that likely? History says it's inevitable, but timeframe is an open question.
> ai infrastructure firms cease to be able to secure more capital
If this does occur, unfortunately it isn’t like any of the production capacity is going to immediately shift or be repurposed. A lot of the hardware isn’t usable outside of datacenter deployments. I would guess a more realistic recalibration is 2-3 years of immense pain followed by gradual availability of components again.
Yeah, 3 years sounds reasonable to me; that's less than one asset depreciation cycle in business. Pain for you and me, but just a bump in the road for the accounts dept.
My computer, and I think all Threadripper systems, has registered ECC DDR5 RAM, which I think is the same type used in AI datacenters. Well, one half of it anyway, the other half being the HBM used on the accelerator cards, which is soldered on and non-upgradeable. But the main system memory from a used AI server can become your main system memory.
So that becomes the next question -- will we see an ecosystem of modifications and adapters, to desolder surplus and decommissioned datacenter HBM and put it on some sort of daughterboard with a translator so it can be used in a consumer machine?
Stuff like that already exists for flash memory; I can harvest eMMC chips from ewaste and solder them to cheaply-available boards to make USB flash drives. But there the protocols are the same, there's no firmware work needed...
The thesis wouldn't be "completely derailed", just slightly delayed. The reasons the powers above are pushing for that dystopian model aren't contingent on AI. If it all went away, we'd get a surge of hardware availability and a drop in prices, followed by the same trends: a slow transition to 'cheaper' remote computing wearing down the more expensive custom PC market, higher prices further reducing demand and creating a spiral, until people who want personal computing become a niche market segment that nearly goes extinct. The result is still the same. Everyone will be using thin clients, or computers that are more like smartphones or Chromebooks than modern PCs, with most services provided through the tightly-regulated internet via subscriptions. It would just take us more time to get there.
The AI bubble feels like it's already starting to burst. Sooner or later the market will see that AI has improved productivity by 20% (or whatever the figure is) and realise that's much less than the 1000% improvement they've been pricing in.