The explosive growth of AI-powered computing centers has led to an unprecedented surge in electricity demand, which threatens to overwhelm the power grid and derail climate targets. At the same time, artificial intelligence technology can revolutionize energy systems and accelerate the transition to clean electricity.
The challenge, said William H. Green, director of the MIT Energy Initiative (MITEI), at MITEI’s Spring Symposium held on May 13, is “achieving our clean energy goals while we enjoy the benefits of AI without harm.” Understanding both data centers’ energy demand and AI’s potential to accelerate the energy transition is a MITEI research priority.
AI’s incredible energy demands
From the outset, the symposium laid out stark statistics on AI’s appetite for electricity. After decades of flat electricity demand in the United States, computing centers now consume about 4% of the nation’s electricity. Although there is great uncertainty, some forecasts suggest this demand could rise to 12-15% by 2030, driven largely by artificial intelligence applications.
Vijay Gadepally, a senior scientist at MIT Lincoln Laboratory, emphasized the scale of AI’s consumption. “The power required to sustain some of these large models is doubling almost every three months,” he noted. “A single ChatGPT conversation uses about as much electricity as charging your phone, and generating an image consumes about a bottle of water for cooling.”
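Gadepally’s doubling figure implies steep compound growth. As a purely illustrative back-of-the-envelope calculation (only the three-month doubling period comes from his quote; the function and numbers are my own), the rate compounds as follows:

```python
# Illustrative compound-growth arithmetic: if the power required to sustain
# large models doubles roughly every 3 months, one year spans 4 doublings.
def growth_factor(months: float, doubling_period_months: float = 3.0) -> float:
    """Multiplicative growth over `months` at the given doubling period."""
    return 2.0 ** (months / doubling_period_months)

print(f"after 1 year:  {growth_factor(12):.0f}x")   # 2**4  = 16x
print(f"after 2 years: {growth_factor(24):.0f}x")   # 2**8  = 256x
```

Even if the doubling period were twice as long, the one-year factor would still be fourfold, which is why efficiency gains alone struggle to offset demand growth.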
Facilities requiring 50 to 100 megawatts of power to run large language models such as ChatGPT and Gemini are emerging rapidly around the world, driven by both casual and institutional use. Congressional testimony by OpenAI CEO Sam Altman highlighted how tight this relationship has become: “The cost of intelligence, the cost of AI, converges to the cost of energy.”
“While AI’s energy demand poses a key challenge, there are also opportunities to harness this vast computing power for climate change solutions,” said Evelyn Wang, MIT’s vice president for energy and climate and former director of the U.S. Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E).
Wang also noted that innovations developed for AI and data centers, such as efficiency improvements, cooling technologies, and clean power solutions, could have wide-ranging applications beyond the computing facilities themselves.
Strategies for clean energy solutions
The symposium explored multiple pathways for addressing the AI-energy challenge. Some panelists presented modeling results suggesting that artificial intelligence could increase emissions in the short term, but that its optimization capabilities could enable significant emissions reductions after 2030 through more efficient power systems and accelerated development of clean technologies.
The cost of powering computing centers varies sharply by location, according to research presented by Emre Gençer, co-founder and CEO of Sesame Sustainability and former principal research scientist at MITEI. Gençer’s analysis revealed that the central United States offers considerably lower costs thanks to complementary solar and wind resources. However, achieving zero emissions raises costs two to three times because of the large-scale battery deployments required (5 to 10 times higher than under a moderate-carbon scenario).
“If you want to achieve zero emissions with reliable electricity, you need technologies other than renewables and batteries,” he said, pointing to “long-duration storage technologies, small modular reactors, geothermal, or hybrid approaches” as the necessary complements.
Interest in nuclear energy is growing because of data centers’ energy needs, added Kathryn Biegel, manager of R&D and corporate strategy at Constellation Energy, noting that her company is restarting a reactor at the former Three Mile Island site, now known as the Crane Clean Energy Center. “The data center space has become a major priority for Constellation,” she said, highlighting how the need for both reliability and carbon-free electricity is reshaping the electricity industry.
Can AI accelerate energy transitions?
Artificial intelligence could dramatically improve power systems, according to Priya Donti, the Silverman Family Career Development Professor in MIT’s Department of Electrical Engineering and Computer Science and the Laboratory for Information and Decision Systems. She showed how AI can accelerate power grid optimization by embedding physics-based constraints into neural networks, solving complex power flow problems “more than 10 times faster” than traditional models.
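One common way to embed physics into a learned model is to penalize violations of a physical constraint during training. The sketch below is my own highly simplified illustration of that idea, not Donti’s method or code: a toy linear “dispatch” model is fit so that generator outputs both match supervised targets and respect the power-balance constraint that generation sums to demand. All data and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: total demand (input) -> dispatch across 3 generators (output).
# The invented ground truth splits demand 50/30/20 across the generators.
demand = rng.uniform(50, 150, size=(200, 1))
target = demand @ np.array([[0.5, 0.3, 0.2]])

W = rng.normal(scale=0.1, size=(1, 3))  # linear model: dispatch = demand @ W

lam = 1.0   # weight on the physics (power-balance) penalty
lr = 1e-6   # gradient-descent step size
for _ in range(2000):
    pred = demand @ W
    err = pred - target                                   # supervised error
    imbalance = pred.sum(axis=1, keepdims=True) - demand  # sum(gen) - demand
    # Gradient of 0.5*||err||^2 + 0.5*lam*||imbalance||^2 w.r.t. W:
    grad = demand.T @ err + lam * demand.T @ (imbalance @ np.ones((1, 3)))
    W -= lr * grad / len(demand)

# After training, dispatch approximately sums to demand (physics respected).
print("max power-balance violation:",
      np.abs((demand @ W).sum(axis=1) - demand.ravel()).max())
```

Real physics-informed approaches go much further, for example enforcing hard constraints inside the network architecture rather than via a soft penalty, but the penalty term above captures the basic mechanism.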
AI is already reducing carbon emissions, according to examples shared by Antonia Gawel, global director of sustainability and partnerships at Google. Google Maps’ fuel-efficient routing feature has helped prevent more than 2.9 million metric tons of GHG (greenhouse gas) emissions. Another Google research project uses artificial intelligence to help pilots avoid creating contrails, which account for around 1% of global warming impact.
AI’s potential to speed up materials discovery for power applications was highlighted by Rafael Gómez-Bombarelli, the Paul M. Cook Career Development Associate Professor in MIT’s Department of Materials Science and Engineering. “AI-supervised models can be trained to go from structure to property,” he noted, enabling the development of materials critical for both computing and efficiency.
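The structure-to-property idea can be sketched in a few lines: train a cheap surrogate model on labeled examples, then use it to screen a large pool of candidates far faster than running an expensive simulation or experiment for each one. The example below is purely illustrative and is not from the symposium; the descriptors, the “property,” and the data are all invented, and a simple ridge regression stands in for the far richer models used in practice.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical featurized "structures": each row is a descriptor vector
# (e.g., composition fractions, lattice features). The "property" is a
# made-up linear-plus-noise response standing in for expensive simulation.
X = rng.normal(size=(500, 8))
true_w = rng.normal(size=8)
y = X @ true_w + 0.01 * rng.normal(size=500)

# Fit a ridge-regression surrogate (closed form) on the labeled data.
alpha = 1e-3
w_hat = np.linalg.solve(X.T @ X + alpha * np.eye(8), X.T @ y)

# Screen a large pool of new candidates with the cheap surrogate instead
# of recomputing the property the slow way for each one.
candidates = rng.normal(size=(10_000, 8))
scores = candidates @ w_hat
best = candidates[scores.argmax()]
print("predicted best property:", scores.max())
```

The payoff is the screening step: once trained, the surrogate evaluates thousands of candidates in a single matrix multiply, and only the top-ranked structures need to be verified with the expensive calculation.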
Ensuring growth with sustainability
Throughout the symposium, participants grappled with how to balance the rapid deployment of AI against its environmental impacts. While AI training receives most of the attention, Dustin Demetriou, senior technical staff member for sustainability and data center innovation at IBM, cited a World Economic Forum article suggesting that “80% of the environmental footprint is estimated to come from inference.” Demetriou stressed the need for efficiency across all artificial intelligence applications.
Jevons’ paradox, whereby “efficiency gains tend to increase overall resource consumption rather than decrease it,” is another factor to consider, warned Emma Strubell, assistant professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University. Strubell advocated treating data center electricity as a finite resource requiring thoughtful allocation across competing applications.
Several presenters discussed novel approaches for integrating renewable sources with existing grid infrastructure, including potential hybrid solutions that pair new clean generation with existing natural gas plants that hold valuable grid connections. These approaches could provide substantial clean capacity across the United States at reasonable cost while minimizing reliability impacts.
Navigating the AI Energy Paradox
The symposium highlighted MIT’s central role in developing solutions to the AI-electricity challenge.
Green described MITEI’s new research program on computing centers, power, and computation, which operates in coordination with the broader research of the MIT Climate Project. “We’re trying to tackle a very complex problem, all the way from the power source through the algorithms that deliver value to customers, in a way that is acceptable to all stakeholders and actually meets all the needs,” he said.
Randall Field, MITEI’s director of research, polled symposium participants on priorities for MIT research. The real-time results ranked “data center and grid integration issues” as the top priority, followed by “AI for accelerated discovery of advanced materials for energy.”
In addition, attendees revealed that they mostly view AI’s potential as a “promise” rather than a “peril,” though a considerable share remain uncertain about its ultimate impact. When asked about priorities for powering computing facilities, half of respondents chose carbon intensity as their primary concern, followed by reliability and cost.
