China vs U.S.: Race to Generate Power for AI Data Centers as Electricity Demand Soars

The International Energy Agency (IEA) forecasts that over the next five years, global electricity demand will grow roughly 50% faster than it did during the previous decade, and more than twice as fast as overall energy demand. Much of that increase is driven by power-hungry AI data centers, along with electric cars and buses, electrified industrial machinery, and electric home heating.

Global AI growth will be contingent on generating more power for data centers:

  • Global data center power demand is now expected to rise to a record 1,596 terawatt-hours by 2035, a +255% increase from 2025 levels.
  • The U.S. is set to remain the leader in data center power consumption, with a +144% surge in demand over this period, to 430 terawatt-hours.
  • China’s demand is projected to rise +255%, to 397 terawatt-hours.
  • European demand is expected to surge +303%, to 274 terawatt-hours.
  • New data centers coming online between now and 2030 will need more than 600 terawatt-hours of electricity. This is enough to power ~60 million homes.
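As a rough sanity check on the figures above, the implied 2025 baselines can be recovered from each 2035 projection and its stated percentage increase. The 2025 values below are derived arithmetically, not quoted from the source:

```python
# Back-of-the-envelope check: derive the implied 2025 demand (TWh) from
# each projected 2035 demand and its stated percentage increase.
#   implied_2025 = projected_2035 / (1 + pct_increase / 100)

projections = {
    # region: (2035 demand in TWh, % increase from 2025)
    "Global": (1596, 255),
    "U.S.":   (430, 144),
    "China":  (397, 255),
    "Europe": (274, 303),
}

for region, (twh_2035, pct) in projections.items():
    implied_2025 = twh_2035 / (1 + pct / 100)
    print(f"{region}: implied 2025 demand ~ {implied_2025:.0f} TWh")

# The ~600 TWh needed by new data centers through 2030, spread across
# ~60 million homes, works out to 10,000 kWh per home per year --
# close to typical U.S. household consumption.
per_home_kwh = 600e12 / 60e6 / 1000  # TWh total -> Wh -> kWh per home
print(f"Per-home equivalent: {per_home_kwh:,.0f} kWh/year")
```

The derived baselines (roughly 450 TWh global, 176 TWh U.S., 112 TWh China, 68 TWh Europe in 2025) are internally consistent with the growth percentages quoted.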

 

Power for AI Data Centers: China vs U.S.:

China is currently ahead of the United States in generating and building out power infrastructure to support AI data centers, a phenomenon sometimes described by industry observers as an “electron gap.”

China’s rapid, centralized expansion of electricity generation—including both massive renewable projects and traditional, dispatchable power—has created a significant capacity advantage in the race to support AI workloads, which are increasingly limited by energy availability rather than just chip access.

Key factors in China’s power advantage for AI include:

Massive Generation Growth: Between 2010 and 2024, China’s electricity production grew by more than that of the rest of the world combined. In 2024 alone, China added 543 gigawatts of power capacity—more than the total capacity added by the U.S. in its entire history.

Significant Surplus Capacity: By 2030, China is projected to have roughly 400 gigawatts of spare power capacity, which is triple the expected power demand of the global data center fleet at that time.

“Eastern Data, Western Computing” Initiative: China is actively shifting energy-intensive data centers to its resource-rich western regions (like Inner Mongolia) while powering them with surplus renewable energy, such as wind and solar.

Lower Costs and Faster Buildouts: Data centers in China can pay less than half the electricity rates that American data centers do. Moreover, projects in China can move from planning to operation in months rather than years, thanks to faster permitting and fewer regulatory hurdles than in the U.S.

Conclusions:

While the U.S. currently leads in advanced AI chips and model development, it is facing a severe “energy bottleneck” for new data centers, with some requiring over a gigawatt of power. U.S. power demand has remained relatively flat for 20 years, resulting in a lag in building new capacity, whereas China has traditionally built power infrastructure in anticipation of high demand. Morgan Stanley has forecast that U.S. data centers could face a 44-gigawatt electricity shortfall in the next three years.
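To put capacity figures like the 44-gigawatt shortfall in context alongside the annual energy figures quoted earlier, a gigawatt of continuous load corresponds to 8.76 TWh per year. The conversion below is my own illustration, not a calculation from the source, and the 50% capacity factor is an assumed example value:

```python
HOURS_PER_YEAR = 8760  # 24 hours * 365 days

def gw_to_twh_per_year(gw: float, capacity_factor: float = 1.0) -> float:
    """Energy (TWh/year) delivered by `gw` of capacity running at the
    given capacity factor (1.0 = continuous, around the clock)."""
    return gw * HOURS_PER_YEAR * capacity_factor / 1000

# The forecast 44 GW U.S. shortfall, if run continuously:
print(gw_to_twh_per_year(44))        # ~385 TWh/year
# China's projected ~400 GW of spare capacity at an assumed 50% capacity factor:
print(gw_to_twh_per_year(400, 0.5))  # ~1752 TWh/year
```

By this measure, the projected U.S. shortfall alone is comparable in scale to the ~600 TWh that new data centers worldwide are expected to need through 2030.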

Despite China’s advantage in energy, U.S. export controls on high-end AI chips (such as Nvidia’s GPUs) have acted as a significant constraint on China’s actual AI compute power. This has led to a situation where the U.S. has the best “brains” (chips) but limited power to run them, while China has the “muscle” (energy) but limited access to top-tier AI brains.

However, the rapid improvements in Chinese AI models (such as DeepSeek), which are more energy-efficient and optimized for lower-tier hardware, may help mitigate this constraint.

References:

https://www.bloomberg.com/news/newsletters/2026-02-14/ai-battle-turbocharged-by-50-power-demand-surge-new-economy

https://www.iea.org/reports/electricity-2026

https://x.com/KobeissiLetter/status/2023437717888250284

2 thoughts on “China vs U.S.: Race to Generate Power for AI Data Centers as Electricity Demand Soars”

  1. China’s AI sector is experiencing a massive boom, driven by intense government support, strategic self-reliance in hardware, and rapid, large-scale industrial adoption. Poised to be a global leader by 2030, the market could reach $1.4 trillion, with key growth areas including manufacturing, robotics, and consumer AI apps.

    Encouraging AI development and adoption is national policy in China, and it appears highly effective to date, as these metrics show:

    Industrial Policy & Self-Reliance: Facing U.S. export controls, China is aggressively pursuing AI chip self-reliance, driving up the valuations of domestic players like Cambricon and Moore Threads.

    Government-Driven Infrastructure: Beijing is fostering a “national algorithm registry,” with over 400,000 data enterprises supporting a digital economy valued at ~5.86 trillion yuan (approx. $826 billion) in 2024.

    Physical & Consumer AI: China is dominating in AI that interacts with the physical world, including humanoid robots from firms like Unitree, as well as AI-powered eyewear and consumer electronics.

    Massive Adoption & Data: With over half a billion users, tools like Baidu’s Ernie Bot have reached massive scale, creating an unrivaled testing ground for rapid deployment.

    Strategic Cost-Cutting: AI is being heavily deployed in logistics and manufacturing to optimize, reduce costs, and offset economic pressures.

    Despite challenges, the focus is on achieving “AI Plus” initiatives—integrating AI into every sector of the economy and sustaining competitiveness through rapid, open-source innovation (e.g., DeepSeek) and massive, high-efficiency compute investment.

  2. America’s AI datacenter boom isn’t slowing down; if anything, it’s threatening to take our power grid down with it. These facilities consume as much electricity as major cities, and our grid simply wasn’t built to handle it. If Congress doesn’t act, the consequences won’t be limited to the tech industry. They’ll ripple across the entire economy.

    Datacenters are already straining resources that the rest of us depend on:

    The datacenter boom is gobbling up critical electronic components, like memory and storage chips, creating shortages that could trigger a crisis for consumer electronics companies and the businesses that depend on them.

    High-voltage transformers now take up to 4 years to manufacture, with costs up to six times pre-2022 levels, leaving the broader grid buildout starved of essential equipment.

    Some datacenters are already bypassing the grid entirely, firing up dirty natural gas generators to meet power demands that existing infrastructure can’t satisfy, worsening our climate crisis in the process.

    Grid operators like PJM are already overwhelmed, pushing connection timelines to 2030 and beyond — and that’s before accounting for the full wave of proposed facilities.
