
Confronting the AI/energy conundrum
The rapid buildout of AI-powered computing centers is driving an unprecedented surge in electricity demand, straining power grids and threatening climate goals. Paradoxically, artificial intelligence could also revolutionize energy systems and accelerate the transition to clean power. This dilemma was the central theme of the MIT Energy Initiative (MITEI) Spring Symposium, titled “AI and energy: Peril and promise,” held on May 13.
As William H. Green, director of MITEI and Hoyt C. Hottel Professor in the MIT Department of Chemical Engineering, articulated at the symposium, “We’re at a cusp of potentially gigantic change throughout the economy.” The event convened leading experts from industry, academia, and government to dissect this dual challenge: addressing “local problems with electric supply and meeting our clean energy targets” while striving to “reap the benefits of AI without some of the harms.” MITEI has prioritized research into data center energy demand and the potential benefits of AI for the energy transition.
AI’s escalating energy footprint
The symposium opened with statistics underscoring the scale of AI’s electricity consumption. After decades of flat electricity demand in the United States, computing centers now consume approximately 4 percent of the nation’s total electricity. Projections, though subject to uncertainty, suggest this share could rise to 12 to 15 percent by 2030, driven predominantly by artificial intelligence applications.
Vijay Gadepally, a senior scientist at MIT’s Lincoln Laboratory, illustrated the scale of AI’s consumption. “The power required for sustaining some of these large models is doubling almost every three months,” he noted, adding that “A single ChatGPT conversation uses as much electricity as charging your phone, and generating an image consumes about a bottle of water for cooling.” The rapid emergence of facilities requiring 50 to 100 megawatts of power in the US and abroad reflects this trend, fueled by both casual and institutional reliance on large language models like ChatGPT and Gemini. In congressional testimony, OpenAI CEO Sam Altman emphasized the foundational relationship between AI and energy: “The cost of intelligence, the cost of AI, will converge to the cost of energy.”
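To put that doubling rate in perspective: power that doubles every three months grows sixteenfold in a year. A minimal back-of-the-envelope sketch (the 50-megawatt starting figure is borrowed from the facility sizes mentioned above and used purely for illustration):

```python
# Back-of-the-envelope growth projection: power doubling every 3 months.
# The 50 MW starting point is illustrative only, not a symposium figure.
def projected_power_mw(start_mw: float, months: int, doubling_months: float = 3.0) -> float:
    """Exponential growth: power doubles every `doubling_months` months."""
    return start_mw * 2 ** (months / doubling_months)

start = 50.0  # MW
for months in (3, 6, 12):
    print(f"after {months:2d} months: {projected_power_mw(start, months):,.0f} MW")
# A 3-month doubling time implies a 16x increase per year (2**4).
```

The arithmetic makes clear why even short-lived trends at this doubling rate force grid planners to react quickly.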
Evelyn Wang, MIT vice president for energy and climate and former director at the Advanced Research Projects Agency-Energy (ARPA-E) at the U.S. Department of Energy, highlighted the silver lining: “The energy demands of AI are a significant challenge, but we also have an opportunity to harness these vast computational capabilities to contribute to climate change solutions.” Wang emphasized that innovations developed for AI and data centers—such as advancements in efficiency, cooling technologies, and clean-power solutions—could find broad applications beyond computing facilities, fostering wider energy system improvements.
Charting pathways to clean energy
The symposium delved into various strategies to tackle the AI-energy challenge. Some models presented by panelists suggested that while AI might initially increase emissions, its optimization capabilities could lead to substantial emissions reductions post-2030 through more efficient power systems and accelerated clean technology development.
Emre Gençer, co-founder and CEO of Sesame Sustainability and a former MITEI principal research scientist, shared insights into regional variations in the cost of powering computing centers with clean electricity. His analysis indicated that the central United States offers considerably lower costs, thanks to its complementary solar and wind resources. However, achieving truly zero-emission power would require massive battery deployments, five to 10 times more than in moderate carbon-reduction scenarios, driving costs two to three times higher. “If we want to do zero emissions with reliable power, we need technologies other than renewables and batteries, which will be too expensive,” Gençer concluded, pointing to the need for “long-duration storage technologies, small modular reactors, geothermal, or hybrid approaches.”
The soaring energy demands from data centers have rekindled interest in nuclear power. Kathryn Biegel, manager of R&D and corporate strategy at Constellation Energy, noted that her company is restarting the reactor at the former Three Mile Island site, now rebranded as the “Crane Clean Energy Center,” specifically to meet this burgeoning demand. Biegel emphasized that the data center space has become a “major, major priority for Constellation,” indicating how the industry’s need for both reliability and carbon-free electricity is reshaping the power sector.
AI as a catalyst for the energy transition
Artificial intelligence is poised to dramatically enhance power systems, according to Priya Donti, assistant professor and the Silverman Family Career Development Professor in MIT’s Department of Electrical Engineering and Computer Science and the Laboratory for Information and Decision Systems. She demonstrated how AI can accelerate power grid optimization by embedding physics-based constraints into neural networks, potentially solving complex power flow problems at “10 times, or even greater, speed compared to your traditional models.”
Concrete examples of AI’s current carbon emission reductions were shared by Antonia Gawel, global director of sustainability and partnerships at Google. Google Maps’ fuel-efficient routing feature, for instance, has “helped to prevent more than 2.9 million metric tons of GHG [greenhouse gas] emissions since launch, which is the equivalent of taking 650,000 fuel-based cars off the road for a year.” Another Google research project uses AI to help pilots avoid creating contrails, which contribute about 1 percent of global warming impact.
The profound potential of AI to expedite materials discovery for power applications was highlighted by Rafael Gómez-Bombarelli, the Paul M. Cook Career Development Associate Professor in the MIT Department of Materials Science and Engineering. He noted that “AI-supervised models can be trained to go from structure to property,” enabling the rapid development of novel materials crucial for both advanced computing and improved energy efficiency.
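A supervised structure-to-property mapping can be sketched in miniature (the descriptors, values, and nearest-neighbor model below are invented for illustration; real approaches use learned representations and far richer data):

```python
# Toy "structure to property" surrogate (all data invented for illustration):
# map simple structural descriptors to a property via 1-nearest-neighbor lookup.
import math

# (descriptor vector, property) pairs, e.g. (avg. atomic radius, density) -> band gap
training = [
    ((1.2, 5.3), 1.1),
    ((0.9, 2.1), 3.4),
    ((1.5, 7.8), 0.2),
]

def predict_property(descriptor):
    """Return the property of the nearest training structure (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = min(training, key=lambda pair: dist(pair[0], descriptor))
    return nearest[1]

print(predict_property((1.0, 2.5)))  # nearest neighbor is (0.9, 2.1), so prints 3.4
```

The value of such surrogates is speed: once trained, screening a candidate material takes microseconds instead of the hours a first-principles simulation can require.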
Ensuring sustainable growth
A recurring theme throughout the symposium was the delicate balance required between rapid AI deployment and its environmental impacts. While AI training often garners the most attention, Dustin Demetriou, senior technical staff member in sustainability and data center innovation at IBM, cited a World Economic Forum article suggesting that “80 percent of the environmental footprint is estimated to be due to inferencing.” Demetriou stressed the critical need for efficiency across all artificial intelligence applications, not just during training phases.
Emma Strubell, the Raj Reddy Assistant Professor in the Language Technologies Institute at Carnegie Mellon University, cautioned about the Jevons paradox, in which “efficiency gains tend to increase overall resource consumption rather than decrease it.” Strubell advocated treating computing center electricity as a finite resource that must be allocated thoughtfully across diverse AI applications.
Several presenters also explored innovative approaches for integrating renewable sources with existing grid infrastructure, including potential hybrid solutions. These could involve combining clean energy installations with existing natural gas plants that already possess valuable grid connections, offering a pathway to substantial clean capacity across the United States at reasonable costs while minimizing reliability impacts.
Navigating the AI-energy paradox
The symposium underscored MIT’s pivotal role in developing comprehensive solutions to the AI-electricity challenge. William H. Green articulated the vision for a new MITEI program focused on computing centers, power, and computation, designed to operate in conjunction with the broader MIT Climate Project research. “We’re going to try to tackle a very complicated problem all the way from the power sources through the actual algorithms that deliver value to the customers — in a way that’s going to be acceptable to all the stakeholders and really meet all the needs,” Green stated.
A real-time poll of symposium participants, conducted by Randall Field, MITEI director of research, revealed key priorities for MIT’s ongoing research. “Data center and grid integration issues” emerged as the top priority, closely followed by “AI for accelerated discovery of advanced materials for energy.” Furthermore, attendees largely viewed AI’s potential regarding power as a “promise,” rather than a “peril,” though a notable portion remained uncertain about its ultimate impact. When asked about priorities in power supply for computing facilities, half of the respondents selected carbon intensity as their primary concern, with reliability and cost as subsequent considerations.



