Schneider Electric warns of future where datacenters eat the grid
Report charts four scenarios from 'Sustainable AI' to 'Who Turned Out The Lights?'
Policymakers need to carefully guide the future consumption of electricity by AI datacenters, according to a report that considers four potential scenarios and suggests a number of guiding principles to prevent it from spiraling out of control.
The research published by energy infrastructure biz Schneider Electric follows the IEA Global Conference on Energy & AI last month. Titled Artificial Intelligence and Electricity: A System Dynamics Approach, it looks at the emerging schools of thought relating to AI and the associated impact on electricity consumption.
Much has already been reported on the rise of AI, and especially generative AI, which has led to huge investment in high-performance and power-hungry infrastructure for the purposes of developing and training models.
As the report notes, existing datacenter infrastructure requires significant energy to function, and will need additional resources to support the anticipated growth in AI adoption. This is already fueling concerns about the potential strain on electricity grids and the possible environmental impact if energy demand to power AI keeps rising at its current rate.
Schneider has modeled four distinct scenarios, which it has labeled as: Sustainable AI; Limits To Growth; Abundance Without Boundaries; and Energy Crisis. All four forecast a general upward trend in energy consumption for the period 2025 to 2030, but diverge notably after this based on the assumptions underpinning each one.
Sustainable AI looks at the potential outcome of prioritizing efficiency while energy consumption steadily increases, whereas Limits To Growth outlines a constrained path where AI development hits natural or human-related limits. Abundance Without Boundaries considers the potential risks of unchecked growth, while the Energy Crisis scenario examines how mismatched energy demand and generation would potentially lead to widespread shortages.
According to Schneider's model, Sustainable AI represents a promising approach, with energy consumption rising from an expected 100 terawatt-hours (TWh) in 2025 to 785 TWh in 2035.
Under this scenario, GenAI inferencing will become the primary driver of electricity consumption within the AI sector by 2027-2028, but there will also be a shift towards more efficient and less energy-intensive models. The report states that it is "characterized by a symbiotic relationship between AI infrastructure and demand, where efficiency and resource conservation are mutually reinforced."
With Limits To Growth, the continued uptake of GenAI inferencing is found to be susceptible to constraints from power and infrastructure. The report foresees total AI energy use growing from the baseline of 100 TWh in 2025 to 510 TWh by 2030, but with challenges such as grid power availability in key datacenter hubs, manufacturing bottlenecks for specialized AI chips, and scarcity of data for large language models all taking their toll.
The Abundance Without Boundaries scenario indicates that the rapid and unrestrained development of AI systems poses the risk of a continual arms race towards bigger and more powerful infrastructure, outpacing the capacity for sustainable resource utilization.
Here, Schneider forecasts total AI energy consumption rising enormously from 100 TWh in 2025 to 880 TWh by 2030, and continuing on an upward trajectory to reach a staggering 1,370 TWh in 2035.
This scenario illustrates the Jevons Paradox, where improvements in AI efficiency paradoxically lead to increased overall energy consumption, because cheaper compute spurs even more use. It forecasts that AI and datacenters will expand without barriers, as techno-optimists drive rapid AI deployment across all sectors, believing that AI advances will solve any resource constraints.
Finally, the Energy Crisis model foresees the rapid growth of AI leading to its energy demands conflicting with other critical sectors of the economy. This triggers various negative outcomes, including economic downturns and severe operational challenges for AI-dependent industries.
Here, AI energy consumption is forecast to peak around 2029, reaching approximately 670 TWh, followed by a drop to about 380 TWh by 2032 and a further reduction to 190 TWh in 2035. Uncoordinated AI governance will lead to fragmented policies, which could result in global or localized energy shortages, according to the report.
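For a rough sense of how sharply these trajectories diverge, the back-of-the-envelope sketch below computes the compound annual growth rates implied by the headline figures quoted above. It is illustrative arithmetic only, using the numbers reported in this article, and is not part of Schneider's system dynamics model.

```python
# Compound annual growth rates implied by the scenario figures quoted above,
# from a common 2025 baseline of 100 TWh. Illustrative arithmetic only --
# not taken from Schneider's model.

def cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Compound annual growth rate between two annual consumption figures."""
    return (end_twh / start_twh) ** (1 / years) - 1

scenarios = {
    "Sustainable AI (2025-2035)":                 (100, 785, 10),
    "Limits To Growth (2025-2030)":               (100, 510, 5),
    "Abundance Without Boundaries (2025-2030)":   (100, 880, 5),
    "Abundance Without Boundaries (2025-2035)":   (100, 1370, 10),
    "Energy Crisis, to its 2029 peak":            (100, 670, 4),
}

for name, (start, end, years) in scenarios.items():
    print(f"{name}: ~{cagr(start, end, years):.0%} per year")
```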
Schneider lists a number of recommendations for sustainable AI, which break down into three main areas: AI Infrastructure; AI Development; and Governance, Standards, and Education.
The first of these advocates that next-generation datacenters should be optimized with the latest cooling technologies, high-density compute, and modern energy-efficient AI hardware such as GPUs and TPUs. Operators should regularly assess and upgrade infrastructure while aiming to improve Power Usage Effectiveness (PUE) in datacenters.
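PUE is simply the ratio of a facility's total energy draw to the energy that actually reaches IT equipment, so a value of 1.0 would mean every watt goes to compute. A minimal sketch of the calculation follows; the figures are illustrative, not drawn from the report.

```python
# Power Usage Effectiveness (PUE): total facility energy divided by the
# energy delivered to IT equipment. The closer to 1.0, the less energy is
# lost to cooling, power conversion, lighting, and other overheads.
# The figures below are illustrative, not from Schneider's report.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Example: a facility drawing 1.3 GWh in a month, of which 1.0 GWh reaches
# the IT equipment.
print(pue(1_300_000, 1_000_000))  # -> 1.3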
- Cloudy with a chance of GPU bills: AI's energy appetite has CIOs sweating
- Datacenters could blow up your electric bill thanks to AI
- AI's power trip will leave energy grids begging for mercy by 2027
- Datacenter developer says power issues holding up new builds
It also suggests accelerating deployment of on-site renewable energy generation combined with advanced energy storage solutions to ensure a stable power supply, investing in technologies such as solid-state batteries or hydrogen storage.
Utilities should also plan for the growing energy demands of AI, which will involve working with energy providers, policymakers, and AI companies to align on comprehensive strategies.
On AI Development, the recommendations are to make models more efficient through techniques such as model pruning, quantization, and lightweight architectures, and to develop measured power profiles for AI hardware, the report states.
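As a concrete illustration of what two of those techniques look like in practice, the sketch below applies magnitude pruning and post-training dynamic quantization to a small PyTorch model. The toy model and the 30 percent pruning threshold are choices made up for the example, not recommendations from the report.

```python
# A minimal sketch of magnitude pruning and post-training dynamic
# quantization in PyTorch. The toy model and thresholds are illustrative.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small stand-in model; a real deployment would target an actual
# inference workload such as an LLM.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Prune the 30% smallest-magnitude weights in each Linear layer, then make
# the pruning permanent so the masks are baked into the weight tensors.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")

# Dynamic quantization: weights stored as int8, activations quantized on
# the fly at inference time, cutting memory footprint and compute cost.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    print(quantized(torch.randn(1, 512)).shape)  # torch.Size([1, 10])
```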
AI companies should establish clear key performance indicators (KPIs) for AI projects that include energy efficiency and environmental impact alongside business outcomes, while circular economy principles should be applied to AI hardware and software to minimize negative impact.
As far as Governance, Standards, and Education goes, Schneider says that policymakers should develop and put in place certification schemes for sustainable AI practices with clear, measurable criteria for energy efficiency and environmental impact.
Additionally, robust AI governance frameworks should guide responsible AI development and deployment, addressing energy consumption, data privacy, and ethical considerations.
The report also advocates AI education programs that emphasize sustainable practices, which it sees as crucial for building a workforce equipped to address future challenges. Companies should establish partnerships with educational institutions to create training programs that combine technical AI skills with environmental awareness, for example.
For anyone with an interest, a large part of the report is given over to appendices discussing the methods the Schneider researchers adopted in developing and informing their scenarios. This involved creating system dynamics models to try to answer "what if" questions about possible outcomes, for example, along with the various factors and weightings that affect them.
However, the authors also add a disclaimer that they are aware of the compromises involved in trying to forecast future scenarios. While the study provides insights into potential AI electricity consumption, the report says, it also highlights areas that require further investigation.
These include a better understanding of AI's environmental footprint through a comprehensive lifecycle assessment covering manufacturing, datacenter construction, and end-of-life disposal. Future research should also improve on the system dynamics models used in this study to more effectively capture the dynamic nature of AI demand across different sectors and applications, the report states.
In a foreword, Rémi Paccou, director of the Schneider Electric Sustainability Research Institute, says the research is not meant to be prescriptive, but that by exploring these potential futures, the hope is to prepare stakeholders to navigate the challenges and opportunities that lie ahead.
"Instead, we hope it serves as a starting point for informed discussion and decision-making. We present our findings with the understanding that AI is a rapidly evolving field and that our knowledge is constantly growing," he said.
The overall message is that governments and industry leaders need to strategically plan to balance AI growth with environmental and economic sustainability. Whether they do so is another matter. ®