
Is energy the real AI bottleneck? What investors need to know

Before the COVID-19 pandemic, U.S. energy demand was fairly predictable: It was driven by steady population increases and economic growth. According to Energy Information Administration (EIA) data, U.S. electricity consumption grew at an average annual rate of just 0.1% from 2005 to 2020. But the insatiable energy appetite of artificial intelligence (AI) data centers has carved an ever-growing wrinkle into that formerly predictable model.

Occasional updates to aging infrastructure are no longer good enough. The energy demand from AI data centers is real, large, and structurally constrained - but the bottleneck creates opportunity.

The Motley Fool examines the energy demands of AI data centers, the infrastructure bottlenecks slowing the build-out, and what the dynamics mean for investors.

Key Points

  • AI energy consumption is reshaping the U.S. energy landscape.
  • Producing more energy only addresses part of the challenge.
  • The AI data center power grid will likely remain heavily dependent on fossil fuels through at least 2030.

How much energy do AI data centers use?

U.S. data centers consumed an estimated 177 to 192 terawatt-hours (TWh) of electricity in 2024 - roughly 4% to 5% of all U.S. electricity - and could consume 9% to 17% by 2030 under scenarios developed by the Electric Power Research Institute (EPRI). The updated range is about 60% higher than EPRI's own projections from 18 months earlier. But most of that energy usage is from conventional data centers.

A hyperscale AI data center uses as much power as 100,000 homes - and the largest under construction uses 20 times that. A conventional data center draws 10 to 25 megawatts (MW), while hyperscale AI data centers are often classified as 100 MW or higher, which is enough to power roughly 100,000 homes. The largest hyperscale data center currently under construction exceeds 2 gigawatts (GW), and the largest planned is 5 GW, says the International Energy Agency (IEA). That makes the 2 GW project 20 times the hyperscale threshold, and the 5 GW plan a staggering 200 times the high end of a conventional data center.
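
The arithmetic behind these comparisons is simple. Here is a minimal sketch, assuming the roughly 1 kilowatt average household load implied by the article's 100 MW ≈ 100,000 homes figure (actual averages vary by region and season):

```python
# Rough conversion from data center capacity to homes powered. The ~1 kW
# average household load is the ratio implied by the figures above, not a
# sourced statistic; real averages vary by region and season.
AVG_HOME_KW = 1.0  # assumed average household load in kilowatts

def homes_powered(capacity_mw: float) -> int:
    """Approximate number of average homes a given capacity could supply."""
    return round(capacity_mw * 1_000 / AVG_HOME_KW)

print(homes_powered(100))    # hyperscale threshold: 100,000 homes
print(homes_powered(2_000))  # largest under construction: 2,000,000 homes
print(homes_powered(5_000))  # largest planned: 5,000,000 homes
```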

Data centers are growing in size because AI is far more energy-intensive than basic search. Consider that a single ChatGPT query uses 10 times as much electricity as a traditional Google Search, according to the EPRI. Training GPT-4 required around 42.4 gigawatt-hours (GWh) over 14 weeks, equivalent to the daily electricity use of about 28,500 households in advanced economies, says the IEA.

Training AI models and chat-based use already challenge the grid, but the bigger strain will come from the widespread adoption of AI agents and the increased use of AI inference, which involves applying what a model has been trained to do. Key chip designers, such as Nvidia and Broadcom, are developing specialized AI chips and networking solutions for inference applications. According to IEA data, the B200 GPU, part of Nvidia's Blackwell architecture, is 60% more energy-efficient per floating-point operation (FLOP) than Nvidia's H100, which is itself 80% more efficient than the A100.

Efficiency improvements in IT equipment represent the greatest opportunity to reduce AI data center energy use. Information technology (IT) equipment accounts for 40% to 50% of data center energy, cooling for 30% to 40%, and auxiliary systems for 10% to 30%, says the EPRI. Data centers are growing not just because of increased use of models like OpenAI's ChatGPT, Anthropic's Claude models, Alphabet-owned Google Gemini, and Microsoft Copilot, but because inference is broadening the scope of AI's role in business and daily life.

Investors should pay attention to three structural factors: the accelerating scale of the AI data center build-out, energy demands for computing power (especially inference), and efficiency gains that history suggests won't keep pace with these new energy demands. These factors point to durable, structural growth in electricity demand, not a cyclical spike.

How much energy will AI data centers require?

Data centers could consume 9% to 17% of U.S. electricity by 2030, though the range depends on how many planned projects actually get built. EPRI's latest forecast is 60% higher than its 2025 estimates and represents two to four times 2024 consumption levels. EPRI's 2026 report notes its 2030 range is broadly consistent with Lawrence Berkeley National Laboratory (LBNL) data through 2028, despite different methodologies. LBNL forecasts 325 to 580 TWh of data center electricity demand by 2028, which would be 6.7% to 12% of total projected U.S. electricity demand.
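
As a quick consistency check on the LBNL numbers, dividing each end of the TWh range by its corresponding share should recover roughly the same total U.S. electricity demand. The sketch below uses only the figures quoted above:

```python
# Cross-check LBNL's 2028 forecast: absolute demand (TWh) divided by its
# share of total U.S. demand should imply a similar total in both scenarios.
low_twh, high_twh = 325, 580          # data center demand range by 2028
low_share, high_share = 0.067, 0.12   # 6.7% to 12% of total U.S. demand

implied_total_low = low_twh / low_share
implied_total_high = high_twh / high_share

print(f"Implied total (low scenario): {implied_total_low:,.0f} TWh")
print(f"Implied total (high scenario): {implied_total_high:,.0f} TWh")
# Both land near 4,800-4,900 TWh, so the TWh and percentage figures agree.
```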

How will more energy be provided?

Despite massive investment in solar photovoltaic (PV), onshore and offshore wind, battery energy storage, and nuclear energy, fossil fuels still dominate the U.S. energy mix.

  • Natural gas fills the near-term data center energy gap; renewables lead after 2030. According to 2025 IEA data, natural gas accounts for 40% of U.S. data center electricity usage, making it the largest single source, ahead of renewables at 24%, nuclear at 20%, and coal at 15%. As of the fourth quarter of 2024, the IEA projects U.S. utility-scale gas-fired capacity additions of 84 GW by 2030, and wind and solar additions of 260 GW. Notably, the gas-fired forecast is 32 GW higher than end-of-2023 plans, while the renewables forecast is 20 GW lower, underscoring that energy availability remains a priority - even over sustainability. Shifts in federal policy account for part of the renewables slowdown.
  • EPRI data projects higher wind and solar build-outs due to Inflation Reduction Act tax credits that have since been curtailed under the 2025 budget bill. Natural gas capacity is now expected to increase by 6.6 to 13.7 GW per year from 2025 to 2030, above the 5.7 GW-per-year historical average, says the EPRI. Even with the higher-than-anticipated natural gas build-out, low-emissions sources are projected to account for at least 55% of U.S. data center electricity by 2035, according to the IEA.
  • Globally, renewables are expected to grow at a compound annual growth rate of 22% from 2024 to 2030, meeting approximately 50% of additional global data center demand by 2030. Interestingly, natural gas and nuclear are each expected to add 175 TWh by 2035 to meet data center demand. Up to 25 GW of small modular reactor (SMR) capacity has already been earmarked globally, with the first SMRs expected to come online around 2030. SMRs can help take pressure off utility-scale nuclear power plants. For example, Southern Company's Vogtle nuclear plant in Georgia recently expanded, increasing its total power capacity to 4.8 GW - enough to power 2 million homes and businesses.

Investors should focus more on total energy demand rather than renewables versus fossil fuels in the energy mix. Over the long term, solar, wind, battery energy storage, and nuclear will likely make up a higher proportion of the electricity mix than natural gas and coal, but natural gas consumption could still be far higher five to 10 years from now than today, given AI's outsize energy demands.

Will the AI revolution hit an energy bottleneck?

AI-driven energy demand is causing bottlenecks across the supply chain. Gas turbine deliveries are reportedly delayed by several years, transformer backlogs are up by at least 30%, and transformer prices have risen 1.5 times since 2020, says the IEA. According to EPRI 2026 data, shortages of IT and power equipment, as well as skilled labor, pose both regional and national challenges. Infrastructure development timelines paint an uncertain future for AI data center energy capacity.

  • Data centers take one to three years to build, but new transmission lines take four to eight years, and new power-generation plants take two to 15 years, depending on the type. And that's before factoring in the specialized industrial machinery bottlenecks discussed above.
  • An estimated 20% of planned data centers could face grid connection delays by 2030 - and the pipeline is bigger than previously realized. Furthermore, 50% of U.S. data centers under development sit in existing large clusters, raising the risk of local bottlenecks, says the IEA.
  • Virginia, Northern Europe, and Japan are already hitting connection limits. Northern Virginia's connection queues have ballooned to seven years, compared with five to seven years in the U.K., up to 10 years in the Netherlands, and a full-blown pause in Ireland, according to the IEA.

These bottlenecks, paired with developers racing to bring more power online, are sparking a surge in behind-the-meter (BTM) energy generation and storage, which delivers power directly to a data center rather than relying on the constrained grid.

Bridging the gap between planned and operating AI data center projects takes more than just funding and next-generation chips. Access to networking, IT equipment, and specialized industrial machinery is equally important. Similarly, the energy bottleneck isn't just about producing more energy. Transmission, permitting, and regulatory hurdles are also worth considering. Therefore, it may be useful for investors to think about the AI infrastructure build-out as an interconnected circulatory system with multiple arteries that could get blocked by various factors, rather than a single pathway.

Adjacent opportunities: Cooling, grid infrastructure, backup power, and efficiency

With the grid constrained by supply chain bottlenecks and connection hurdles, batteries, cooling solutions, hardware/software efficiency, grid connections, and backup power are poised to benefit from the demand-driven tailwinds of AI data centers.

  • Beyond the data center: The cooling, grid, and efficiency markets the AI build-out is creating. Power usage effectiveness (PUE) measures a data center's energy efficiency by dividing total facility energy - IT equipment plus overhead such as cooling, lighting, and power distribution - by IT equipment energy. A PUE of 1 would indicate that 100% of electricity is consumed by IT equipment alone. According to the IEA, the global weighted-average PUE is expected to improve from 1.41 to 1.29 between 2024 and 2030, saving 90 TWh of energy. The U.S. is already ahead of the curve, with PUEs averaging 1.32 in 2024. According to 2026 EPRI data, large hyperscale facilities with liquid cooling currently under construction could achieve PUEs of 1.1.
  • Batteries, transformers, and cooling: the infrastructure layer investors may be underweighting. Battery energy storage plays a supporting role in enabling higher renewable matching for data centers. It remains eligible for investment tax credits under current policy, says the EPRI.
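
The PUE arithmetic described above is straightforward. Here is a minimal sketch using illustrative (not sourced) facility figures:

```python
# PUE = total facility energy / IT equipment energy, so a lower value means
# less overhead (cooling, lighting, power distribution) per unit of compute.
def pue(it_energy_mwh: float, overhead_energy_mwh: float) -> float:
    """Power usage effectiveness for one facility over one period."""
    return (it_energy_mwh + overhead_energy_mwh) / it_energy_mwh

# Hypothetical facility drawing 100 MWh of IT load over some period:
print(pue(100, 41))  # 41 MWh overhead -> 1.41, the 2024 global average
print(pue(100, 29))  # 29 MWh overhead -> 1.29, IEA's projected 2030 average
print(pue(100, 10))  # 10 MWh overhead -> 1.10, liquid-cooled hyperscale
```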

Investors should pay close attention to battery energy storage, as it arguably offers the best antidote to the AI energy bottleneck. Batteries are uniquely positioned to power AI data centers without relying on the constrained grid, through site-specific and BTM options. Utility-scale solar paired with battery energy storage offers an alternative to fossil fuels while addressing solar's intermittency when the sun isn't shining.

What the energy data tells investors

The AI data center build-out is moving faster than initially anticipated, leading to upward revisions to projections.

Here are five things to watch:

  1. Hyperscale capex is the driving force behind data center development. If capex growth rates cool, grid and energy supply constraints will likely ease.
  2. Industrial machinery backlogs and power constraints could become a greater bottleneck than AI chips and networking equipment.
  3. Despite the development of utility-scale renewable energy projects, the U.S. energy mix remains heavily dependent on fossil fuels, though nuclear energy could make a significant impact after 2030.
  4. Hyperscale data center PUE improvements could reduce energy needs.
  5. Increased AI adoption creates opportunities for grid infrastructure (transformers, cables, transmission), cooling systems, backup power, efficiency hardware, and demand-flexibility technology.

Backlogs in chip production are straining data center availability and access, but energy is an equally important limiting factor. AI computing challenges can be solved by producing more chips and building data centers, but energy is more nuanced. Even if the energy market could flip a switch to increase supply, there are still environmental and regulatory hurdles to overcome. Perhaps the biggest challenge is aligning supply with forecasted demand, especially given that those forecasts have changed so much in the last two years.

FAQs

How much energy does AI use?

According to 2026 EPRI data, U.S. data center electricity consumption in 2024 was 177 to 192 TWh, which is approximately double 2021 levels.

How much energy does an AI data center use?

New data centers range from 100 to 1,000 MW, which is equivalent to the load of 80,000 to 800,000 average homes. This data is consistent across EPRI 2024, EPRI 2026, and IEA data.

How much energy will the AI data center build-out require?

EPRI's 2026 data estimates U.S. data center annual electricity consumption in 2030 at 383 TWh in the low scenario, 596 TWh in the medium scenario, and 793 TWh in the high scenario. IEA 2025 data projects global data center electricity consumption in 2030 ranging from 945 TWh in the base case to 1,260 TWh in the lift-off case.

What types of energy are used to power AI data centers?

The electricity mix of U.S. data centers is currently 40% natural gas, 24% renewables (mostly solar PV and wind), 20% nuclear, and 15% coal, according to 2025 IEA data.

This story was produced by The Motley Fool and reviewed and distributed by Stacker.

Copyright 2026 Stacker Media, LLC

This story was originally published April 23, 2026 at 5:30 AM.
