The $650 billion question is not whether the US has enough money to build the AI infrastructure it has announced. It clearly does. The question is whether it has enough transformers.
Bloomberg’s April 2026 investigation identified the specific, unglamorous chokepoint threatening to slow the AI boom: high-power electrical transformers — the devices that step grid-voltage electricity down to usable levels inside data centers — are in critical short supply. Delivery times have stretched from 24–30 months before 2020 to as long as five years today. AI data centers need to be built and operational in under 18 months to keep pace with compute demand. The math does not work. Sightline Climate, a firm that tracks data center infrastructure, concluded that between 30% and 50% of all US data center projects scheduled to open in 2026 will be delayed or canceled outright.
This is the real infrastructure crisis of 2026. Not the chip shortage. Not regulatory blockages. Transformers.
Direct Answer: Why are US AI data centers being delayed in 2026? Between 30% and 50% of US data center projects planned for 2026 will be delayed or canceled, according to Sightline Climate’s Data Center Outlook. The primary cause is a severe shortage of high-power transformers, switchgear, and batteries — the electrical equipment required to connect new data centers to the grid. Transformer delivery times have stretched to up to 5 years, while data center build cycles run 12–18 months. The US imports over 40% of its high-power transformers from China, and new tariffs are raising costs without creating domestic alternatives. The hyperscalers (Alphabet, Amazon, Meta, Microsoft) are largely protected by pre-negotiated supply agreements. Smaller operators, regional cloud providers, and new entrants bear the brunt of the delays.
The Numbers That Tell the Story
Sightline Climate’s 2026 Data Center Outlook tracked 190 GW of announced data center capacity across 777 projects globally since 2024. For the US in 2026 specifically:
- 12–16 GW of capacity is scheduled to come online in the US in 2026 across approximately 140 projects
- Only ~5 GW is actually under construction — roughly one-third of the planned total
- 30–50% of 2026 projects are expected to be delayed or canceled
- A further 16 GW sits in announced status with no construction activity on the ground
- 37 GW of announced infrastructure globally has not received a firm completion date, with only 4.5 GW of that having begun work
The 2027 pipeline is worse. More than 25 GW of US data center capacity is scheduled for that year, but less than 10 GW is currently being built. The gap between what has been announced and what is actually under construction is not narrowing — it is widening.
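The widening gap can be checked with back-of-envelope arithmetic using the Sightline Climate figures quoted above (a sketch, not sourced analysis):

```python
# Gap between announced and under-construction US data center capacity,
# using the Sightline Climate figures quoted in this article (all in GW).
planned_2026_low, planned_2026_high = 12, 16   # scheduled to come online in 2026
building_2026 = 5                              # actually under construction
planned_2027 = 25                              # scheduled for 2027
building_2027 = 10                             # currently being built

gap_2026 = (planned_2026_low - building_2026, planned_2026_high - building_2026)
gap_2027 = planned_2027 - building_2027

print(f"2026 shortfall: {gap_2026[0]}-{gap_2026[1]} GW")   # 7-11 GW
print(f"2027 shortfall: at least {gap_2027} GW")           # at least 15 GW
```

At least 15 GW of shortfall in 2027 versus at most 11 GW in 2026: the gap grows, not shrinks.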
This is not a financial problem. Alphabet, Amazon, Meta, and Microsoft are projected to spend a combined $650 billion on AI infrastructure in 2026 alone. The money exists. The electrical equipment does not.
Why Transformers Are the Chokepoint
A data center is fundamentally a power consumption facility. The compute hardware — GPUs, CPUs, memory — is secondary to the electrical infrastructure that feeds it. A data center without a functioning power connection is a building full of dormant metal.
What transformers do:
High-power transformers step high-voltage electricity from the transmission grid — typically delivered at 115 kV to 500 kV — down to the lower voltages that data center equipment can use. A single large hyperscale data center requires multiple large power transformers, each weighing 100–400 tonnes and costing $3–10 million. The transformers connect to switchgear and distribution systems that further condition and route power through the facility.
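To make these per-unit figures concrete, here is an illustrative sizing sketch for a hypothetical 300 MW campus. The per-unit rating and the N+N redundancy factor are assumptions for illustration only; the $3–10 million unit cost range comes from the figures above.

```python
import math

# Illustrative transformer count and cost for a hypothetical 300 MW campus.
# Unit rating and redundancy are assumptions; unit cost range is from the article.
campus_load_mw = 300
unit_rating_mva = 60          # assumed rating per large power transformer
redundancy_factor = 2         # assumed N+N redundancy, common in data centers
unit_cost_low_musd, unit_cost_high_musd = 3, 10   # $3-10M per unit

units = math.ceil(campus_load_mw / unit_rating_mva) * redundancy_factor
cost_low = units * unit_cost_low_musd
cost_high = units * unit_cost_high_musd

print(f"{units} transformers, ${cost_low}M-${cost_high}M")  # 10 transformers, $30M-$100M
```

Even a mid-sized campus, under these assumptions, needs around ten bespoke units — each one a multi-year order.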
Why they take so long to build:
High-power transformers are bespoke industrial equipment. Each unit is custom-engineered for its specific application. The primary winding, core laminations, and insulation systems are built to order. The manufacturing process involves precision winding of copper conductor around a grain-oriented silicon steel core, followed by vacuum impregnation with insulating oil, testing, and commissioning. It cannot be accelerated the way a semiconductor fab can be scaled with capital investment.
Before 2020, the standard lead time for a large power transformer in the US was 24–30 months. By 2025, demand surges from AI infrastructure, EV charging networks, renewable energy installations, and electrified heating systems had already stretched lead times to 3–4 years. By April 2026, transformer manufacturers report lead times of up to 5 years for some configurations.
The AI data center deployment cycle runs 12–18 months. A hyperscaler that decides to build a data center today and orders transformers today cannot have those transformers delivered before the building is supposed to be operational. For any project that did not place transformer orders years in advance, the build schedule is impossible.
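The scheduling mismatch can be expressed in months. This sketch assumes a project that breaks ground and places its transformer order on the same day, using the lead times quoted above:

```python
# Timeline mismatch: order transformers at groundbreaking and the finished
# building still waits years for its grid connection.
build_cycle_months = 18            # upper end of the 12-18 month build cycle
transformer_lead_months = 60       # up to 5 years for some configurations

idle_months = transformer_lead_months - build_cycle_months
print(f"Building sits finished for {idle_months} months before power arrives")
# Building sits finished for 42 months before power arrives
```

Three and a half years of dormant capital, per facility, for anyone who did not order years in advance.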
The China Dependency That Tariffs Cannot Solve
Here the story acquires its sovereignty dimension — and its painful irony.
The United States imports more than 40% of its high-power transformers from China. The two dominant Chinese suppliers — TBEA Co. and China XD Group — are state-backed enterprises with order books filled through 2027. China also accounts for over 40% of US battery imports and nearly 30% of switchgear components.
The Trump administration’s tariff programme was explicitly designed to reduce this dependency. The April 2026 expansion of steel, aluminium, and copper duties placed a 50% tariff on copper — a core component of transformer windings. The effect is not reduced China dependency. The effect is that importing Chinese transformers now costs substantially more, while domestic alternatives do not exist at the required scale.
The domestic gap:
Despite a decade of reshoring rhetoric, US manufacturing capacity for large power transformers remains critically insufficient. The National Infrastructure Advisory Council flagged this as a critical infrastructure risk in its June 2024 report and recommended establishing a strategic national reserve of transformers — a recommendation that was not acted upon before the current shortage materialised.
Major manufacturers are investing, but on timelines that do not help 2026:
- GE Vernova — dominant in US transformer production after acquiring Prolec, expanding North American capacity, but backlogged
- Siemens Energy — committed more than $1 billion to US grid infrastructure including a new large power transformer plant in Charlotte, North Carolina, targeting 2027 production start
- Hitachi Energy — more than $1 billion in North American investments including a new large power transformer facility in Virginia, expected to be the largest in the US by 2028
- Eaton — investing hundreds of millions in new US transformer and switchgear facilities
The earliest meaningful domestic capacity relief comes in 2027–2028. The data center projects that need transformers in 2026 have no domestic solution.
Import diversification attempts:
To address the China shortfall, companies are sourcing from Canada, Mexico, and South Korea — now the largest alternative suppliers of high-power transformers to US data centers. Yet Chinese imports have surged at the same time: high-power transformer imports from China rose from fewer than 1,500 units in 2022 to more than 8,000 units through October 2025, according to Wood Mackenzie data. The US is importing more from China even as it tries to reduce dependency, because no alternative can match Chinese capacity at current demand levels.
Who Is Protected and Who Is Exposed
Not all data center operators face the same risk. The shortage has a clear winner and a clear loser.
Protected: The hyperscalers
Alphabet, Amazon, Meta, and Microsoft saw this coming. They placed long-term, direct procurement agreements with transformer and electrical equipment manufacturers years in advance of current demand. They did not rely on spot market availability. Their facilities — which represent a disproportionate share of the 5 GW currently under construction — are largely insulated from the shortage.
This is the same playbook the hyperscalers used in the chip market. They locked in Nvidia GPU supply through multi-year agreements before the shortage became public. They signed advance power purchase agreements with utilities before energy became the AI bottleneck. Now they have done it again with electrical equipment. Each time, they gained structural advantage over competitors who could not access or could not afford the same forward planning.
Exposed: Everyone else
The 30–50% of 2026 projects facing delay or cancellation are disproportionately:
- Independent colocation providers (Equinix, Digital Realty, and smaller operators)
- Regional cloud platforms attempting to compete with AWS, Azure, and GCP
- New entrants — AI infrastructure startups, specialised GPU cloud providers — that emerged in the past 18 months in response to compute demand
- Enterprise operators building private AI infrastructure without the procurement scale to secure supply
For the independent colocation market specifically, the consequence is prolonged pricing pressure. New colocation supply was expected to erode the hyperscalers’ pricing power over enterprise customers in 2026. That supply is not arriving on schedule. Enterprise buyers of cloud and colocation services can expect pricing to remain elevated longer than forecast.
The Grid Stress That No One Planned For
The transformer shortage sits atop a broader energy infrastructure crisis that the AI boom has exposed.
US electricity grids were designed for a consumption profile that predates AI data centers, electric vehicles, and building electrification all competing for capacity at once. The grid is now stressed at multiple layers:
Generation: AI data centers consume electricity at a scale that requires new power plants. A single large hyperscale campus can consume 100–500 megawatts — equivalent to a small city. Planned US data center capacity represents tens of gigawatts of new load. Utilities were not planning for this demand three years ago.
Transmission: Moving electricity from generation to data center sites requires transmission infrastructure that takes years to permit and build. Interconnection queues at US utilities have expanded dramatically, with some projects waiting 5–7 years for grid connection approval.
Distribution: The transformers and switchgear that step power down to usable levels within the data center are the specific shortage this article addresses. This is the last mile of the power delivery chain — and it has become the limiting constraint.
Community opposition: Sightline Climate explicitly cites “increasingly effective community opposition” as a driver of project delays alongside equipment shortages. Data centers consume enormous amounts of power, produce heat, require large water supplies for cooling, and generate round-the-clock traffic and noise. Local communities — particularly in rural areas targeted for land availability — are organising against approvals with growing effectiveness.
Wisconsin’s recent anti-data center vote is cited in multiple analyses as a signal that community resistance has moved from occasional objection to an organised, replicable opposition strategy.
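The “small city” comparison in the Generation paragraph above can be scale-checked. The average US household load used here (~1.2 kW continuous) is an assumption derived from typical annual residential consumption, not a figure from the article:

```python
# Scale check: how many average US households a 100-500 MW campus displaces.
# ~1.2 kW average household load assumed, from ~10,500 kWh/year over 8,760 hours.
avg_household_kw = 10_500 / 8_760

for campus_mw in (100, 500):
    households = campus_mw * 1_000 / avg_household_kw
    print(f"{campus_mw} MW ≈ {households:,.0f} households")
```

Under these assumptions, a single large campus draws as much power as roughly 80,000 to over 400,000 homes — which is why utilities, transmission planners, and host communities all push back at once.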
The Sovereign Infrastructure Angle
From Vucense’s perspective, this story has implications beyond the AI industry’s internal logistics.
Cloud concentration compounds:
The transformer shortage is accelerating the market structure that many sovereignty advocates consider most problematic: total dominance by four hyperscalers. The operators who survive this bottleneck intact are exactly the ones who were already largest. The operators who cannot absorb delays, cannot hold transformer supply agreements, and cannot negotiate multi-year power purchase agreements are exactly the ones who would provide competitive alternatives to AWS, Azure, and GCP.
A data center market where Amazon, Microsoft, Google, and Meta control an even larger share of capacity than they do today is a market where sovereign alternatives — European cloud providers, regional cloud operators, enterprise private infrastructure — face steeper headwinds.
US-China infrastructure dependency:
The AI boom’s dependence on Chinese electrical equipment is a strategic vulnerability that the tariff war has highlighted without resolving. The US cannot build its AI infrastructure at scale without Chinese transformers, cables, and batteries. Imposing tariffs on those components raises costs and creates political friction without creating domestic supply. The National Infrastructure Advisory Council’s 2024 recommendation for a strategic transformer reserve was the correct policy response — and it was not implemented.
This is the infrastructure equivalent of the chip supply chain concentration risk that drove the CHIPS Act. The difference is that transformer manufacturing cannot be rebuilt as quickly as semiconductor fabs. Transformers are artisanal industrial objects. A new fab takes 3–4 years to build. A new transformer manufacturing facility takes 5–7 years to reach meaningful capacity, requires a skilled workforce that the US has allowed to atrophy, and faces supply chain dependencies of its own for core steel and copper.
What sovereign infrastructure means at the energy layer:
Countries and regions that control their power infrastructure — including the manufacturing of the electrical equipment that connects new capacity to the grid — have a genuine advantage in the AI era. Germany and Japan, which have maintained substantial industrial manufacturing bases including electrical equipment production, are in structurally stronger positions than the US on this dimension. France’s digital sovereignty initiative is notable for covering not just operating systems and cloud providers but the full infrastructure stack — including power and network equipment.
The lesson of 2026’s transformer crisis is that AI sovereignty is an energy question before it is a software question. Whoever controls the power is closer to controlling the compute.
What the Next Two Years Look Like
2026: Delays compound. Independent operators push project timelines into 2027. Hyperscalers deploy on schedule. Pricing power remains with AWS, Azure, and GCP for enterprise customers who expected colocation alternatives to be available.
2027: Siemens’ and Eaton’s new facilities begin production. The South Korean and Canadian supply chains grow. Chinese import volume continues to rise despite tariffs because there is no domestic substitute. The Wisconsin precedent is tested in other communities — Wisconsin-model opposition either spreads or is contained by state preemption legislation.
2028: Hitachi’s Virginia facility reaches operational scale. Domestic transformer supply meaningfully improves. The hyperscalers’ 2027 pipeline begins arriving. But by 2028, the AI infrastructure demands of 2029 and 2030 are already being planned, and the same cycle risks repeating unless the transformer reserve is established and domestic manufacturing investment lands in 2026–2027.
The President’s National Infrastructure Advisory Council recommended the strategic transformer reserve in June 2024. As of April 2026, it has not been implemented. The AI boom will not wait for the policy apparatus to catch up.
FAQ
Why are US data centers being delayed when companies are spending $650 billion on AI infrastructure? The money exists but the electrical equipment does not. High-power transformers — which connect data centers to the power grid — now take up to 5 years to deliver, while data centers need to be built in 12–18 months. No amount of capital can compress a transformer’s manufacturing timeline. Companies that did not pre-order transformers years ago cannot build on schedule.
How many data centers are affected? Sightline Climate estimates 30–50% of US data center projects scheduled for 2026 will be delayed or canceled. Of the 12–16 GW of US capacity planned for this year, only approximately 5 GW is actually under construction.
Why does the US depend on China for transformers? US industrial manufacturing capacity for large power transformers has declined over decades. China has built the world’s largest industrial manufacturing base for electrical equipment. The two dominant Chinese suppliers — TBEA and China XD Group — are state-backed enterprises with order books full through 2027. No US or allied alternative at equivalent scale exists yet.
Do the tariffs help? No, in the short term. Tariffs raise the cost of importing Chinese transformers but do not create domestic alternatives — those take 5–7 years to reach capacity. The 50% tariff on copper (a core transformer component) raises costs without changing supply. The net effect is more expensive Chinese imports, not fewer Chinese imports.
Which companies are most at risk? Independent colocation providers, regional cloud platforms, and new AI infrastructure entrants are most exposed. The hyperscalers (Alphabet, Amazon, Meta, Microsoft) locked in supply agreements years in advance and are largely protected.
What should organisations planning cloud or data centre decisions do? Assume new colocation capacity arrives later than advertised. Pricing for existing capacity will remain elevated longer than forecast. If you are planning significant infrastructure decisions in 2026–2027, model for constrained availability and price premium in colocation and wholesale cloud markets. Hyperscaler contracts are more reliable on infrastructure availability — a counterintuitive advantage during a physical supply shortage.
Related Articles
- TSMC Q1 2026: $35.7B Record Revenue — The AI Chip Chokepoint That Controls Everything
- Big Tech Is Buying Nuclear Power for AI Data Centres — Microsoft, Google, Amazon Bet on SMRs
- Amazon Trainium vs NVIDIA: Why AI Labs Are Switching (2026)
- AI Chip Shortage: Broadcom, TSMC and the Foundry Sovereignty Problem
- OpenAI Pauses Stargate UK: Energy Costs and Regulation Kill the £Billion Data Centre
- Why PC Hardware Prices Are Skyrocketing in 2026