Every time you run an AI query, somewhere a server rack spins up — drawing power from a grid that was never designed to carry this load. The race to build bigger, faster AI systems has quietly become one of the most consequential energy stories of our time, with implications reaching from power bills in Ohio to carbon targets in Brussels.
The numbers are stark. U.S. data centers consumed 183 terawatt-hours of electricity in 2024, more than 4% of the country's total consumption and roughly equivalent to the annual electricity demand of Pakistan. Globally, data center demand is projected to more than double by 2030, reaching around 945 terawatt-hours, a trajectory the world's power infrastructure is struggling to keep pace with.
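For scale, the implied national total can be checked with a line of arithmetic. The total below is inferred from the article's own figures (183 TWh being "more than 4%" of U.S. consumption), not an independent datum:

```python
us_dc_2024_twh = 183   # U.S. data center consumption, 2024
us_share = 0.04        # "more than 4%" of total U.S. consumption

# Because 183 TWh is *more than* 4% of the total, the implied
# national total is at most 183 / 0.04 TWh:
implied_us_total_twh = us_dc_2024_twh / us_share
print(f"Implied U.S. total: under ~{implied_us_total_twh:,.0f} TWh")
```

That ceiling of roughly 4,600 TWh is consistent with published estimates of total U.S. electricity consumption, which is a useful sanity check on the headline share.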
This is not just an abstract infrastructure story. It is a question about who builds the future, who pays for it, and whether the promises of technological progress can coexist with commitments to a liveable planet.
A crisis hiding in plain sight
For years, data centers were treated as someone else’s problem — an industry concern, a zoning issue, an occasional item in a regional news story about water use. That has changed.
In the PJM electricity market, which stretches from Illinois to North Carolina, data centers accounted for an estimated $9.3 billion price increase in the 2025–26 capacity market. The average residential bill is expected to rise by $18 a month in western Maryland and $16 a month in Ohio as utilities scramble to serve massive new AI facilities. A study from Carnegie Mellon University estimates that data centers and cryptocurrency mining could lead to an 8% increase in the average U.S. electricity bill by 2030, potentially exceeding 25% in the highest-demand markets of central and northern Virginia.
In Virginia’s so-called data center alley, the concentration is already extraordinary. In 2023, data centers consumed about 26% of the total electricity supply in Virginia — more than a quarter of an entire state’s power, flowing into facilities most residents will never see. Ireland offers a similarly striking case: around 21% of the nation’s electricity now goes to data centers, with projections suggesting this could reach 32% by 2026.
The power grid includes all the infrastructure to generate and deliver power to our homes, our businesses — and now to AI data centers. In some regions, utilities simply can't keep up.
— Noman Bashir, MIT researcher and TED speaker
Why the grid cannot simply absorb the demand
The challenge is not just volume; it is timing and geography. Unlike electric vehicles, whose load is spread across millions of charging points, data centers concentrate demand in specific locations, making their integration into the grid more challenging. A single hyperscale AI facility can draw power equivalent to a small city, often arriving in a region with existing infrastructure constraints and long connection queues.
Grid connection queues for both supply and consumption projects are long and complex. Building new transmission lines can take four to eight years in advanced economies, and wait times for critical grid components such as transformers and cables have doubled in the past three years.
This is where the argument in Bent Flyvbjerg and Dan Gardner's How big things get done (2023) becomes unexpectedly relevant. Flyvbjerg's central thesis is that the most common failure mode of large infrastructure projects is optimism bias — the tendency to underestimate complexity and overestimate the pace of delivery. The AI industry's relationship with the grid looks like a textbook example: massive capital commitments made on the assumption that power will be available, before the hard work of actually connecting to it has been done. AI data centers in Northern Virginia now face multi-year waits simply to connect to the grid, prompting companies to postpone projects, contract power directly from private producers, and install banks of inefficient gas-fired reciprocating generators on site.
The clean energy paradox
There is a version of this story that feels hopeful. The AI boom has arrived alongside the renewable energy boom, and some analysts argue these forces can be aligned rather than opposed. Half of the global growth in data center demand is projected to be met by renewables, supported by storage and the broader electricity grid, with renewable generation growing by over 450 terawatt-hours through 2035 to serve that demand.
But the reality is more complicated. As of 2024, natural gas supplied over 40% of electricity for U.S. data centers, while renewables supplied about 24%. The gap between the green narrative and the fossil-fuelled reality is wide — and widening as demand accelerates faster than clean capacity can be built.
The idea that technology will solve the problem it is creating has a long and troubled history. Christophe Bonneuil and Jean-Baptiste Fressoz make this case with force in The shock of the Anthropocene (2016), arguing that the convenient story of modern environmental “awakening” has always served to obscure how deeply aware industrial actors were of the damage they were causing — even as they continued causing it. The AI industry’s energy story fits this pattern uncomfortably well: the harm is documented, the alternatives are theoretically available, and the trajectory continues regardless.
Flexibility as a partial answer
The TED talk that prompted this article presents one genuinely interesting technical response to the grid problem. Its central argument is that AI data centers, unlike hospitals or homes, run workloads that are predictable, controllable, and often delayable. This makes them candidates for what researchers call “demand flexibility” — the ability to shift computing loads to times when renewable energy is abundant, or to scale down when the grid is under stress.
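As a toy illustration of what demand flexibility means in practice, the sketch below greedily assigns deferrable compute jobs to the hours with the most forecast renewable headroom. The function name, numbers, and units are all hypothetical; real schedulers also weigh carbon intensity, job deadlines, and service-level constraints. This is not the method from the talk, just a minimal sketch of the idea:

```python
def schedule_deferrable_jobs(renewable_forecast_mw, jobs_mwh, capacity_mw):
    """Greedy sketch: place each deferrable job's energy into the hours
    with the most remaining forecast renewable supply.

    renewable_forecast_mw: forecast renewable supply per hour (MW)
    jobs_mwh: energy requirement of each deferrable job (MWh)
    capacity_mw: max extra draw the facility may add in any one hour
    """
    headroom = list(renewable_forecast_mw)       # renewable left per hour
    plan = {h: 0.0 for h in range(len(headroom))}  # MW scheduled per hour

    for job in sorted(jobs_mwh, reverse=True):   # biggest jobs first
        remaining = job
        # Fill the greenest hours first
        for hour in sorted(plan, key=lambda h: headroom[h], reverse=True):
            if remaining <= 0:
                break
            take = min(remaining, headroom[hour], capacity_mw - plan[hour])
            if take > 0:
                plan[hour] += take
                headroom[hour] -= take
                remaining -= take
    return plan

# With a midday solar peak, the load lands around midday
# instead of the evening:
forecast = [10, 20, 50, 120, 300, 320, 280, 150, 40, 15]
print(schedule_deferrable_jobs(forecast, jobs_mwh=[200, 150], capacity_mw=100))
```

The design choice worth noting is the per-hour capacity cap: without it, a naive scheduler would pile all load into the single greenest hour, which is exactly the kind of spike the grid cannot absorb.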
Electricity consumption by accelerated servers (those equipped with GPUs and other AI accelerators), mainly driven by AI adoption, is projected to grow by 30% annually — which means the window for deploying flexible demand management is narrow. If data centers can be reconfigured to absorb excess solar in the afternoon and ease off at evening peak times, they become a grid asset rather than a liability. This is a real engineering possibility, and serious research supports it.
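To see why the window is narrow, a quick check using only the growth figure above shows that 30% compound annual growth doubles consumption in well under three years:

```python
import math

annual_growth = 0.30  # projected annual growth in accelerated-server demand

# Solve (1 + g)^t = 2 for t:
doubling_time_years = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time at 30%/yr: {doubling_time_years:.1f} years")  # ~2.6
```

Infrastructure that takes four to eight years to permit and build is racing a load that doubles roughly every two and a half years.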
But flexibility does not reduce total demand. It redistributes it. And the communities living near data center corridors — facing higher electricity bills, degraded power quality, and strained local infrastructure — are not primarily asking for smarter scheduling. They are asking who decided that this scale of development should happen here, and who bears the cost.
The question isn't how much energy AI consumes. The real question is how much flexibility, resilience and clean power can AI unlock?
— Noman Bashir, MIT
What sufficiency looks like
The sustainability conversation around AI energy tends to default to two positions: uncritical enthusiasm for technological solutions, or blanket opposition that ignores the genuine benefits of the technology. Neither is adequate.
A more honest accounting would start by asking what we are actually gaining from this energy expenditure — and for whom. The grid strain being experienced in Virginia and Ohio is not evenly distributed. The benefits of AI systems flow primarily to the companies building them and the users wealthy enough to access them. The costs flow to electricity ratepayers, to communities near data center clusters, and to a carbon budget that is already overdrawn.
The IEA notes that data centers represent one of the few sectors where emissions are set to grow, alongside road transport and aviation, as most other sectors are expected to decarbonise in the coming years. This is not a footnote. It is a choice.
The principle of sufficiency — using what is needed rather than maximising what is possible — is central to any serious sustainability practice. It applies to consumption habits and dietary choices. There is no reason it should not also apply to the infrastructure choices that shape how much electricity an industry demands from a grid that everyone depends on.
The grid is not ready for AI. Whether AI is ready to meet the grid — and the planet — on more honest terms is a question the industry has barely begun to ask.