The electricity shortage created by AI demand is not a hypothetical risk on a five-year horizon. It is an engineering constraint already limiting deployment of hardware that has been ordered, paid for, and delivered.
Nvidia GPUs are sitting in warehouses because the data centers to house them don’t have power. The data centers don’t have power because transformer lead times from Siemens and ABB are running at five years. That backlog exists because the industrial capacity to manufacture large power transformers was allowed to atrophy during decades when nobody was building large-scale electrification infrastructure.
Craig Tindale made this point with force in his Financial Sense interview. The AI narrative has been built almost entirely on the financial ledger: compute investment, model capability, revenue projections. The material ledger — the copper, the transformers, the electrical infrastructure — has been largely ignored. That asymmetry is now producing visible bottlenecks that no amount of capital can resolve on a short timeline.
China’s position is instructive by contrast. China has three times the electrical generating capacity of the United States and is expanding at a rate that dwarfs Western grid investment. The AI race is not just a race for compute. It is a race for the physical infrastructure that powers compute — and on that dimension, China is winning in slow motion.
The picks-and-shovels play of the AI era that nobody is talking about is precisely here: grid infrastructure companies, electrical equipment manufacturers, and energy generation assets positioned at the exact bottleneck of the most capital-intensive technology buildout in history.