The AI buildout isn't just eating electricity anymore—it's becoming the primary customer for the entire US power generation industry.
The Summary
- US spending on power-generation equipment for data centers will jump from $2.6 billion in 2025 to $65 billion by 2030, a 25x increase in five years driven almost entirely by AI infrastructure
- Data centers may account for up to 40% of total US power equipment investment by 2030, making AI the single largest driver of power infrastructure spending
- This isn't cloud growth—this is the physical cost of keeping GPU clusters running at scale
The Signal
The numbers tell a story about infrastructure whiplash. Total US spending on power-plant equipment is expected to triple through 2030, according to Wood Mackenzie. But the real story is the composition. Data centers are going from a rounding error to the dominant customer in the span of half a decade.
In 2025, data center power equipment spending sat at $2.6 billion. By 2030, that number hits $65 billion. That's not incremental cloud expansion. That's training runs that need their own power plants.
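To put that trajectory in perspective, the cited figures imply roughly 25x growth over five years, or about 90% compounded annually. A back-of-envelope sketch (the dollar figures come from the article; the calculation is illustrative, not from the Wood Mackenzie report):

```python
# Implied growth from the cited figures: $2.6B (2025) -> $65B (2030).
start, end, years = 2.6, 65.0, 5

multiple = end / start                    # 25x over five years
cagr = (end / start) ** (1 / years) - 1   # compound annual growth rate, ~0.90

print(f"{multiple:.0f}x overall, {cagr:.0%} compound annual growth")
```

A 90% annual growth rate is the kind of curve usually seen in software adoption, not in orders for heavy capital equipment, which is the whiplash the section describes.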
> Data centers may account for up to 40% of total US power equipment investment by 2030, making AI the single largest infrastructure customer.
The implications ripple beyond tech. When data centers become the largest share of the total power equipment market, you're watching energy economics get rewritten in real time. Utilities that spent decades planning around residential and industrial demand now have to factor in customers who can add gigawatts of load with a single facility announcement.
This is what happens when the cost of intelligence drops but the cost of computation doesn't. Every foundation model update, every agent deployment, every company spinning up their own inference infrastructure—all of it translates to turbines, transformers, and transmission lines. The AI companies talk about parameters and benchmarks. The power industry is ordering equipment three years out.
Key infrastructure dynamics:
- Power equipment has 18-36 month lead times—decisions being made now lock in 2027-2028 capacity
- Data centers are geographically concentrated, creating regional grid stress points
- Natural gas and renewable capacity are both seeing surges tied to data center co-location deals
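The lead-time point can be made concrete with simple date arithmetic. Assuming the 18-36 month range cited above, equipment ordered in 2025 comes online between mid-2026 and 2028 (a hypothetical illustration of the lock-in window, not data from the source):

```python
# When does equipment ordered today come online, given 18-36 month lead times?
order_year = 2025
lead_time_months = (18, 36)  # range cited in the article

online = [order_year + months / 12 for months in lead_time_months]
print(f"Ordered {order_year}: online {online[0]:.1f} to {online[1]:.1f}")
```

In other words, an operator who wants capacity energized in 2027-2028 has to place orders now; there is no way to compress that window after the fact.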
The gap between AI capability growth and power infrastructure growth is the real constraint. You can train a model faster every quarter. You can't build a substation faster every quarter. The tripling of total US power equipment spending is the physical world trying to catch up to what the model labs already promised.
The Implication
If you're building AI infrastructure or investing in the picks-and-shovels layer, power isn't a footnote anymore. It's the constraint. Watch where new power generation capacity is being permitted. That's where the next wave of AI clusters gets built. The companies that solve for co-located power, not just cheap land, win the next five years.
For everyone else: the AI buildout is now visible from space, written in gigawatts and copper wire. When data centers command 40% of power equipment spending, they draw a commensurate share of political and regulatory attention. Expect grid reliability, carbon intensity, and energy policy to become core AI company concerns, whether they want it or not.