AI Data Center Boom Carries Huge Default and Demand Risks

“How does the digital economy exist?” asked John Medina, a senior vice president at Moody’s, who specializes in assessing infrastructure investments. “It exists on data centers.”

New investment in data centers to power artificial intelligence (AI) is projected to reach $3 trillion to $4 trillion by 2030, according to Nvidia, while McKinsey estimates that keeping pace with AI demand could require as much as $7 trillion over the same period. The spending is already registering in the broader economy: some analyses indicate that AI data center expenditure contributed more to US GDP growth in 2025 than consumer spending did.

U.S. data center demand, driven largely by AI, could triple by 2030, according to McKinsey. Meta and Alphabet are also investing billions. Even small interactions add up: merely saying “please” and “thank you” to a chatbot eats up tens of millions of dollars in processing power, according to OpenAI’s chief executive, Sam Altman.

Hyperscale cloud providers such as Microsoft, Amazon Web Services, Google, and Meta are committing massive capital to building AI-specific facilities. Microsoft, for example, is investing $80 billion in fiscal 2025 for AI-enabled data centers. Other significant investments include:
  • OpenAI, SoftBank, and Oracle pledging to invest $500 billion in AI infrastructure through 2029.
  • Nvidia and Intel collaborating to develop AI infrastructure, with Nvidia investing $5 billion in Intel stock.
  • Microsoft spending $4 billion on a second data center in Wisconsin.
  • Amazon planning to invest $20 billion in Pennsylvania for AI infrastructure.

Compute and storage servers within an AI data center. Photo credit: iStock quantic69

The spending frenzy comes with a big default risk. According to Moody’s, structured finance has become a popular way to pay for new data center projects, with more than $9 billion of issuance in the commercial mortgage-backed security and asset-backed security markets during the first four months of 2025. Meta, for example, tapped the bond manager Pimco to issue $26 billion in bonds to finance its data center expansion plans.

As more debt enters these data center build-out transactions, analysts and lenders are putting more emphasis on lease terms for third-party developers. “Does the debt get paid off in that lease term, or does the tenant’s lease need to be renewed?” Medina of Moody’s said. “What we’re seeing often is there is lease renewal risk, because who knows what the markets or what the world will even be like from a technology perspective at that time.”
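The concern can be made concrete with a quick back-of-the-envelope check. The short Python sketch below compares a loan’s amortization schedule against the length of the tenant’s lease; the loan size, interest rate, amortization period, and lease term are purely hypothetical, not terms from any actual deal.

```python
# A minimal sketch of the lease-renewal risk Medina describes, using
# hypothetical figures (loan size, rate, and lease length are illustrative only).

def balance_at_lease_end(principal, annual_rate, amort_years, lease_years):
    """Return the monthly payment and the loan balance left when the lease expires."""
    r = annual_rate / 12                      # monthly interest rate
    n = amort_years * 12                      # total amortization payments
    m = lease_years * 12                      # payments made during the lease
    payment = principal * r / (1 - (1 + r) ** -n)
    balance = principal * (1 + r) ** m - payment * ((1 + r) ** m - 1) / r
    return payment, balance

# Hypothetical $1B data center loan amortized over 25 years,
# backed by a 15-year hyperscaler lease.
payment, residual = balance_at_lease_end(1_000_000_000, 0.06, 25, 15)
print(f"monthly payment: ${payment:,.0f}")
print(f"balance still owed when the lease ends: ${residual:,.0f}")
# A positive residual means the lender is exposed to lease-renewal risk:
# the debt is fully repaid only if the tenant renews or a new tenant is found.
```

Under these illustrative numbers, several hundred million dollars of principal would still be outstanding when the initial lease expires, which is exactly the renewal risk lenders are now weighing.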

Even if AI proliferates, demand for processing power may not. Chinese technology company DeepSeek has demonstrated that AI models can produce reliable outputs with less computing power. As AI companies make their models more efficient, data center demand could drop, making it much harder to turn investments in AI infrastructure into profit. After Microsoft backed out of a $1 billion data center investment in March, UBS wrote that the company, which has lease obligations of roughly $175 billion, most likely overcommitted.

Some worry costs will always be too high for profits. In a blog post on his company’s website, Harris Kupperman, a self-described boomer investor and the founder of the hedge fund Praetorian Capital, laid out his bearish case on AI infrastructure. Because the buildings need upkeep and the chips and other technology will continually evolve, he argued, data centers will depreciate faster than they can generate revenue.

“Even worse, since losing the A.I. race is potentially existential, all future cash flow, for years into the future, may also have to be funneled into data centers with fabulously negative returns on capital,” he added. “However, lighting hundreds of billions on fire may seem preferable than losing out to a competitor, despite not even knowing what the prize ultimately is.”
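Kupperman’s depreciation argument is easy to sketch in a few lines of Python. The capital split between chips and building, the useful lives, the revenue, and the operating costs below are illustrative assumptions chosen to show the shape of the math, not figures from his post.

```python
# A rough sketch of the bearish depreciation math, with hypothetical numbers.

def annual_economics(chip_capex, shell_capex, chip_life_yrs, shell_life_yrs,
                     annual_revenue, annual_opex):
    """Straight-line depreciation versus what the facility earns in a year."""
    depreciation = chip_capex / chip_life_yrs + shell_capex / shell_life_yrs
    operating_profit = annual_revenue - annual_opex - depreciation
    return depreciation, operating_profit

# Hypothetical $10B site: $7B of accelerators refreshed roughly every 4 years,
# $3B of building and power gear depreciated over 20 years.
dep, profit = annual_economics(
    chip_capex=7e9, shell_capex=3e9,
    chip_life_yrs=4, shell_life_yrs=20,
    annual_revenue=1.8e9, annual_opex=0.6e9,
)
print(f"annual depreciation:       ${dep/1e9:.2f}B")
print(f"profit after depreciation: ${profit/1e9:.2f}B")
# If the hardware really must be replaced on a four-year cycle, the
# depreciation line alone can swamp the revenue the site generates.
```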

It’s not just Silicon Valley with skin in the game. State budgets are being upended by tax incentives given to developers of AI data centers. According to Good Jobs First, a nonprofit that promotes corporate and government accountability in economic development, at least 10 states so far have lost more than $100 million per year in tax revenue to data centers. But the full monetary impact may never be known: more than one-third of states that offer tax incentives for data centers do not disclose aggregate revenue loss.

Local governments are also racing to expand energy infrastructure to support the surge of data centers. Phoenix, for example, is expected to grow its data center power capacity by over 500 percent in the coming years — enough power to support over 4.3 million households. Virginia, which has more than 50 new data centers in the works, has contracted the state’s largest utility, Dominion, to build 40 gigawatts of additional capacity to meet demand — triple the size of the current grid.

The stakes extend beyond finance. The boom in data center activity has been linked to distorted residential power readings across the country. And according to the International Energy Agency, a 100-megawatt data center that uses water to cool its servers consumes roughly two million liters of water per day, the equivalent of 6,500 households. That strains the water supply of nearby residential communities, a majority of which, according to Bloomberg News, already face high levels of water stress.
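The IEA comparison is simple enough to check directly; the figures in the sketch below are the ones quoted in the paragraph above.

```python
# Back-of-the-envelope check of the IEA water comparison cited in the text.
DATA_CENTER_LITERS_PER_DAY = 2_000_000   # IEA figure for a 100 MW, water-cooled site
HOUSEHOLDS = 6_500                       # household equivalence cited in the text

implied_household_use = DATA_CENTER_LITERS_PER_DAY / HOUSEHOLDS
print(f"implied household consumption: ~{implied_household_use:.0f} liters/day")
# The comparison implies roughly 300 liters of water per household per day.
```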

