Open any newspaper and you will find at least three or four articles on Artificial Intelligence. Everyone wants in, but this revolution may have reached a structural inflection point. Success is no longer determined exclusively by models, software, investment, or semiconductors; it increasingly depends on where, and how quickly, new electricity supply can be brought to market. As hyperscalers, chipmakers, and AI-first firms compete to build the computational backbone of tomorrow’s economy, they face an uncomfortable truth: the energy and grid infrastructure required to power that ambition may not exist at the scale or speed the industry expects.
The surge in AI infrastructure investment
The numbers are staggering. Data centers accounted for about 1.5% of global electricity consumption in 2024 (IEA), and demand continues to grow at double-digit rates. The IEA expects data centers to consume 945 terawatt-hours (TWh) by 2030, roughly equivalent to Japan’s current annual electricity consumption. In the U.S., data-center electricity use rose from about 58 TWh in 2014 to 176 TWh in 2023, with projections suggesting 325–580 TWh by 2028. Globally, nearly $3 trillion of cumulative buildout is expected by 2028 or 2029 when hyperscale, colocation, and AI training facilities are included, yet all of that capacity depends on sufficient electricity supply and grid infrastructure.
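A rough sanity check on the U.S. trajectory is straightforward arithmetic. The sketch below is a back-of-envelope calculation using only the figures quoted above; the growth-rate extrapolation is illustrative, not a forecast.

```python
# Back-of-envelope check on the U.S. data-center figures cited above.
# The consumption numbers come from the article; the rest is arithmetic.

us_2014_twh = 58      # U.S. data-center consumption, 2014 (TWh)
us_2023_twh = 176     # U.S. data-center consumption, 2023 (TWh)
years = 2023 - 2014

# Implied compound annual growth rate over 2014-2023
cagr = (us_2023_twh / us_2014_twh) ** (1 / years) - 1
print(f"Implied CAGR 2014-2023: {cagr:.1%}")            # ~13% per year

# Extrapolating the same rate forward to 2028 (illustrative only)
projected_2028 = us_2023_twh * (1 + cagr) ** (2028 - 2023)
print(f"Extrapolated 2028 consumption: {projected_2028:.0f} TWh")  # ~326 TWh

# The cited 325-580 TWh range for 2028 therefore assumes growth at least
# as fast as the past decade, and considerably faster at the high end.
```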
Financial markets are racing to support this expansion, deploying long-term capacity contracts, private equity, equipment-backed lending, and structured securitizations of future power or compute revenue streams. These structures enable rapid scaling but also create systemic risk. Investors are betting that today’s generation of AI hardware will remain economically valuable long enough to justify massive capital allocation, a bet that depends on both technology performance and energy abundance.
High-profile initiatives, such as the so-called “Stargate” project associated with OpenAI and partners including Microsoft, Oracle, SoftBank, and UAE-based investors, demonstrate the tension between ambition and reality. Early-phase deployments are capital-intensive, while later capacity depends on regulatory approvals, energy procurement, transmission upgrades, and environmental compliance. The mismatch between contractual financial commitments and the physical delivery of power highlights the broader infrastructure challenge facing the AI industry.
OpenAI’s ambitious expansion
OpenAI has entered into approximately $1 trillion worth of deals (Yahoo Finance) for computing power to support its artificial intelligence operations, including partnerships with AMD, Nvidia, Oracle, and CoreWeave. These agreements are intended to provide OpenAI access to over 20 gigawatts of computing capacity, equivalent to the output of 20 nuclear reactors, over the next decade. OpenAI’s major investments are aimed at scaling up services like ChatGPT, but the financial commitments far exceed its revenue, posing significant funding challenges.
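To put the 20-gigawatt figure in perspective, a quick unit conversion helps. The sketch below assumes a typical large nuclear reactor produces roughly 1 GW and that the AI hardware runs close to continuously; both are simplifying assumptions, not figures from the deals themselves.

```python
# Rough conversion of the 20 GW compute-capacity figure into energy terms.
# Assumptions (not from the article): ~1 GW per large nuclear reactor,
# and near-continuous operation of the AI hardware.

capacity_gw = 20
reactor_gw = 1.0                  # typical large reactor output (assumed)
hours_per_year = 8760

equivalent_reactors = capacity_gw / reactor_gw
annual_twh = capacity_gw * hours_per_year / 1000   # GW x hours -> GWh -> TWh

print(f"Equivalent reactors: {equivalent_reactors:.0f}")          # 20
print(f"Annual energy if run continuously: {annual_twh:.0f} TWh") # ~175 TWh

# ~175 TWh per year is on the order of total U.S. data-center consumption
# in 2023 (176 TWh, per the figures cited earlier) -- for one company's plans.
```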
The company’s deals include complex financial structures involving incentives and equity, such as warrant options from AMD and a $100 billion investment commitment from Nvidia. These arrangements aim to help OpenAI fund chip purchases while simultaneously boosting partner valuations, evidenced by sharp market value increases for AMD and Oracle after their deals were announced.
Despite raising $47 billion in VC funding and $4 billion in bank debt, and being valued at $500 billion, OpenAI is expected to lose $10 billion this year. Experts question its financial sustainability, noting its high capital intensity and uncertain path to profitability. CEO Sam Altman downplayed immediate concerns over profitability, emphasizing long-term growth. However, with such massive spending and investor expectations, any slowdown in AI growth could put OpenAI’s ambitious expansion plans at risk.
The AI energy demand surge
A new report from global risk firm DNV projects that power demand from AI-driven data centers will rise tenfold by 2030. North America, particularly the United States and Canada, is expected to be the primary driver of this surge in energy consumption. Despite this significant increase, DNV states that AI will likely still account for less than 3% of global electricity usage by 2040, remaining lower than sectors like electric vehicle charging and building cooling. The report is based on an analysis of over 50 global data center energy estimates and places DNV on the more conservative side of projections. This finding adds to the ongoing debate about the energy implications of expanding artificial intelligence infrastructure.
BloombergNEF forecasts that U.S. data-center power demand will more than double by 2035, rising from almost 35 gigawatts in 2024 to 78 gigawatts. Energy consumption will grow even faster: average hourly electricity demand is projected to nearly triple, from 16 gigawatt-hours in 2024 to 49 gigawatt-hours by 2035.
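The distinction between those two figures is easy to miss: gigawatts describe installed capacity, while average gigawatt-hours per hour describe the energy actually drawn. The short sketch below uses only the numbers above to make the difference concrete; the utilization figure is simply the ratio of the two projections, not a BloombergNEF statistic.

```python
# Separating capacity (GW) from consumption (average GWh per hour),
# using the BloombergNEF projections cited above.

capacity_2035_gw = 78          # projected U.S. data-center capacity, 2035
avg_hourly_2035_gwh = 49       # projected average hourly demand, 2035
hours_per_year = 8760

# Implied average utilization of installed capacity (illustrative ratio)
utilization = avg_hourly_2035_gwh / capacity_2035_gw
print(f"Implied average utilization: {utilization:.0%}")          # ~63%

# Annual energy consumption implied by the average hourly figure
annual_twh = avg_hourly_2035_gwh * hours_per_year / 1000
print(f"Implied annual consumption, 2035: {annual_twh:.0f} TWh")  # ~429 TWh
```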
Drawing parallels with the 1990s internet boom
The current AI infrastructure boom bears striking similarities to the internet infrastructure expansion of the 1990s. During that period, massive investments were made in building the foundational infrastructure for the internet, including data centers, fiber-optic cables, and networking equipment. Many of these investments, however, were driven more by fear of missing out than by a clear picture of future demand, leading to overcapacity and financial losses for some companies.
Similarly, the rapid scaling of AI infrastructure today is being driven by optimistic projections of future demand. While AI has the potential to revolutionize entire industries, the current pace of investment may be outpacing actual demand for AI services. This could leave companies with heavily capitalized but underutilized infrastructure, resulting in financial strain and potential market corrections.
Moreover, just as the internet boom of the 1990s led to the bursting of the dot-com bubble, there are concerns that the current AI infrastructure boom could lead to an “AI bubble.” Analysts caution that the combination of sky-high expectations and immense capital commitments has created the outlines of an AI bubble, where hype may outpace deliverable capacity.
Conclusion
To put it bluntly, the $3 trillion planned investment in AI infrastructure is as much a bet on energy and grid capacity as it is on machine learning and quantum computing. Success will depend on a degree of coordination between technology firms, utilities, financial markets, and governments that rarely materializes. Failure could result not only in financial losses but in a fundamental recalibration of expectations about AI’s role in the global economy. The next five years will determine whether AI fulfills its transformative promise or confronts the constraints of the energy systems on which it depends.