Google Releases First-Ever Data on AI Prompt Energy Consumption

Google has released a technical report detailing how much energy its Gemini apps consume per query. A median text prompt uses 0.24 watt-hours, roughly the energy of running a microwave for one second. The report also provides estimates of the water consumed and carbon emitted per text prompt. The release is a notable transparency move for a major tech company and offers an unusually detailed look at how such figures are calculated. Efforts to understand AI's energy footprint have been growing, but direct measurement has been difficult without access to the operations of the major tech players.
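As a rough sanity check on that comparison, the arithmetic below converts the reported 0.24 watt-hours into seconds of microwave run time. The ~900 W oven rating is an assumption chosen for illustration, not a figure from Google's report.

```python
# Back-of-envelope check of the microwave comparison.
PROMPT_WH = 0.24        # median Gemini text prompt, watt-hours (from the report)
MICROWAVE_WATTS = 900   # assumed typical microwave power draw (illustrative)

# Wh -> joules, then divide by power to get seconds of run time.
seconds_equivalent = PROMPT_WH * 3600 / MICROWAVE_WATTS
print(f"~{seconds_equivalent:.2f} s of microwave time")  # ~0.96 s
```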

Earlier this year, MIT Technology Review published a series of stories on AI and energy; at the time, no major AI company would share per-prompt energy figures. Google's new report offers some of those long-awaited details. It accounts for energy demand across the AI chips and the infrastructure that supports them. According to Google's chief scientist Jeff Dean, the AI chips themselves, Google's custom TPUs, account for 58% of the 0.24-watt-hour total. The rest goes to the host machine's CPU and memory (25%), idle machines kept as backup (10%), and data center overhead (8%).
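That breakdown lends itself to simple arithmetic. The sketch below splits the 0.24 Wh figure by the shares quoted above; the percentages come from the report, while the conversion to absolute watt-hours is illustrative (the rounded shares sum to about 101%, so the components are approximate).

```python
# Rough split of the 0.24 Wh median figure using the shares Google reports.
TOTAL_WH = 0.24
shares = {
    "TPU (AI accelerator)": 0.58,
    "Host CPU and memory": 0.25,
    "Idle backup machines": 0.10,
    "Data center overhead": 0.08,
}
for component, share in shares.items():
    print(f"{component:22s} {share * TOTAL_WH:.3f} Wh ({share:.0%})")
```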

According to Mosharaf Chowdhury of the University of Michigan, the report is valuable for AI and energy research because it provides data that only the companies themselves can gather, given the scale of their operations and their access to the facilities involved. Still, Google's figure does not capture the full range of Gemini queries: energy demand varies widely with the complexity of the task, and a large request, such as summarizing several books, demands far more energy than the median prompt.

The report covers only text prompts, excluding image and video generation, which typically consume more energy. Google also notes a steep decline in the energy required per Gemini query over time: median energy demand fell by a factor of 33 between May 2024 and May 2025, an improvement the company credits to software optimizations and more efficient models.
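To put the factor of 33 in perspective, the back-of-envelope below infers what a median prompt might have required in May 2024, assuming the ratio applies to the same per-prompt metric. The report states only the ratio, so the implied earlier value is an inference, not a published number.

```python
# Implied earlier energy use, under the assumption that the 33x reduction
# applies directly to the current 0.24 Wh median-prompt figure.
CURRENT_WH = 0.24
REDUCTION_FACTOR = 33

implied_may_2024_wh = CURRENT_WH * REDUCTION_FACTOR
print(f"Implied May 2024 median: ~{implied_may_2024_wh:.1f} Wh per prompt")  # ~7.9 Wh
```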

For carbon emissions, Google estimates that the median prompt produces 0.03 grams of CO2, calculated from the energy used and the emissions per unit of electricity. Rather than applying the average emissions of the regional grid, Google uses a market-based estimate that reflects its clean-energy purchases, which lowers the emissions per unit of electricity relative to the grid average. Data centers also use water for cooling; Google estimates each prompt accounts for 0.26 milliliters, or about five drops.
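These per-prompt figures can be tied together with a short calculation. The sketch below back-computes the market-based emission factor implied by Google's energy and carbon numbers and converts the water estimate into drops; the ~0.05 mL-per-drop figure is an assumption, and the implied emission factor is a derivation, not a value stated in the report.

```python
# Per-prompt arithmetic derived from the report's median figures.
PROMPT_WH = 0.24         # median energy per text prompt, watt-hours
PROMPT_CO2_G = 0.03      # median emissions per text prompt, grams CO2e
PROMPT_WATER_ML = 0.26   # median water per text prompt, milliliters

implied_factor = PROMPT_CO2_G / (PROMPT_WH / 1000)  # grams CO2e per kWh (back-calculated)
drops = PROMPT_WATER_ML / 0.05                      # assuming ~0.05 mL per drop

print(f"Implied market-based emission factor: ~{implied_factor:.0f} gCO2e/kWh")  # ~125
print(f"Water per prompt: ~{drops:.0f} drops")                                    # ~5
```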

The goal, Dean says, is to give users a sense of AI's energy impact; he argues that the energy and water used by Gemini models is modest, comparable to everyday activities. The report broadens public understanding of AI's resource use amid growing calls for transparency about the technology's energy demands. Sasha Luccioni, an AI and climate researcher, praises Google's effort but argues that standardized AI energy ratings are still needed and urges further disclosure, such as the total number of Gemini queries served each day, which would allow the overall footprint to be assessed.
