An MIT Tech Review investigation found that the common understanding of AI’s energy consumption is full of holes. Starting from an estimate of how much energy a single typical LLM query consumes, and noting that such queries are now being built into countless applications, the researchers conclude that “the energy resources required to power this AI revolution are staggering.” By 2028, AI could consume as much electricity annually as 22 percent of all US households, and data centers are expected to keep trending toward dirtier, more carbon-intensive forms of energy. The companies building and deploying AI models are not transparent enough about exactly how much energy these models use or what sort of energy sources will power AI’s future. The head of Hitachi Energy, the world’s largest transformer maker, is urging governments to rein in Big Tech’s AI-related energy use in order to maintain stable supplies: “No user from an industry point of view would be allowed to have this kind of behaviour.”
In a recent blog post, Sam Altman offered questionable numbers for OpenAI’s water and electricity usage. Without citing sources or evidence, the CEO claimed that a single query requires about as much energy as a high-efficiency lightbulb uses in a couple of minutes, and roughly one-fifteenth of a teaspoon of water. The estimate is suspect: a University of California study from late 2023 found that GPT-3 – an older, less energy-intensive model than the one behind ChatGPT – used 31 million gallons of water per day. (Altman’s figure would translate to 31 million gallons per year.) Regardless of accuracy, experts say Altman’s number is not meaningful without the context of how it was calculated. Per Sasha Luccioni, the climate lead at Hugging Face, “He [Altman] could have pulled that out of his ass.”
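As a rough sanity check, Altman’s one-fifteenth-of-a-teaspoon figure can be scaled to an annual total. The query volume below is an assumption for illustration only (roughly one billion queries per day), not an OpenAI disclosure:

```python
# Rough sanity check of Altman's per-query water claim.
# ASSUMPTION: ~1 billion queries/day (illustrative, not an OpenAI figure).
TSP_ML = 4.92892           # milliliters per US teaspoon
GALLON_ML = 3785.41        # milliliters per US gallon
queries_per_day = 1e9      # assumed query volume

ml_per_query = TSP_ML / 15                     # one-fifteenth of a teaspoon
gallons_per_day = queries_per_day * ml_per_query / GALLON_ML
gallons_per_year = gallons_per_day * 365

print(f"{gallons_per_day:,.0f} gallons/day")
print(f"{gallons_per_year / 1e6:.1f} million gallons/year")
```

Under that assumed volume, the claim works out to tens of thousands of gallons per day, or on the order of 31 million gallons per year, which is how the per-day versus per-year comparison in the paragraph above arises.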
Taking a different tack, Google has for the first time released detailed data on the environmental impact of an average Gemini text query: the median prompt consumes 0.24 watt-hours of electricity and 0.26 milliliters of water, and emits 0.03 grams of carbon dioxide. (Note that these figures do not apply to image or video generation queries.) Google’s report also included a broad look at the company’s AI energy demand, including the power to run AI chips and the other infrastructure needed to support that hardware. The report represents a major step forward in AI-related environmental disclosures, but it crucially omits the total number of queries Gemini receives each day, which would allow for estimates of its total energy demand.
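To see why the missing query count matters, here is a sketch of how total demand could be estimated from Google’s per-prompt medians. The daily volume used below is purely hypothetical, not a Google figure:

```python
# Scale Google's reported per-prompt medians to daily totals.
# ASSUMPTION: 1 billion text prompts/day is hypothetical, not from Google.
wh_per_prompt = 0.24       # watt-hours of electricity (median, per Google)
ml_per_prompt = 0.26       # milliliters of water
g_co2_per_prompt = 0.03    # grams of carbon dioxide

prompts_per_day = 1e9      # assumed volume

mwh_per_day = prompts_per_day * wh_per_prompt / 1e6            # Wh -> MWh
liters_per_day = prompts_per_day * ml_per_prompt / 1e3         # mL -> L
tonnes_co2_per_day = prompts_per_day * g_co2_per_prompt / 1e6  # g -> tonnes

print(f"{mwh_per_day:,.0f} MWh/day, {liters_per_day:,.0f} L/day, "
      f"{tonnes_co2_per_day:,.0f} t CO2/day")
```

A single assumed volume turns the per-prompt medians into fleet-level totals, which is exactly the calculation the omitted query count prevents outside observers from doing.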
A recent Accenture report proposes a new metric for measuring AI’s environmental impact. Using tokens as a standardized unit of performance, the Sustainable AI Quotient tracks how efficiently AI usage transforms money, energy, and emissions into measurable performance. The report concludes that the way AI is currently being scaled is not sustainable.
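The report’s exact formula is not reproduced here, but the idea of tokens as a standardized unit of performance can be sketched as a set of simple efficiency ratios. The function, field names, and numbers below are illustrative assumptions, not Accenture’s definition:

```python
# Illustrative efficiency ratios in the spirit of Accenture's Sustainable AI
# Quotient: tokens delivered per unit of cost, energy, and emissions.
# ASSUMPTION: all names and numbers here are hypothetical, not from the report.
def efficiency_ratios(tokens: float, dollars: float,
                      kwh: float, kg_co2: float) -> dict:
    """Return tokens produced per dollar, per kWh, and per kg of CO2."""
    return {
        "tokens_per_dollar": tokens / dollars,
        "tokens_per_kwh": tokens / kwh,
        "tokens_per_kg_co2": tokens / kg_co2,
    }

# Example: a workload producing 10 million tokens (hypothetical inputs).
ratios = efficiency_ratios(tokens=10_000_000, dollars=500.0,
                           kwh=1_200.0, kg_co2=480.0)
print(ratios)
```

Tracking ratios like these over time is one way a falling tokens-per-kWh figure could signal the unsustainable scaling the report describes.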
Questions to consider
How are companies calculating the emissions from their AI development and use? Are they working with a per-query estimate? How are they classifying different uses across Scope 1, Scope 2, and Scope 3?
How are companies assessing the environmental and social risks of their AI supply chains? Are they assessing risks to local communities? What steps are being taken to mitigate these risks?


