In this photo illustration, the Gemini logo is seen on a mobile screen on July 2, 2025.

For those watching the trend, the increasing reliance on AI for internet queries raises a significant energy and climate concern. More and more people are using ChatGPT and similar services for basic questions, and even standard Google searches now come with AI-generated responses.

Estimates of the energy consumption and associated climate impact of an AI query vary considerably. ChatGPT, for instance, is reported to use up to 0.34 watt-hours per prompt, roughly enough to power a household lightbulb for 20 seconds. At the other end of the range, some researchers have found that certain models can consume up to 100 times more energy on longer prompts.
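As a quick sanity check, the lightbulb comparison implies a particular bulb wattage. The arithmetic below is our own back-of-the-envelope inference from the stated figures, not a number from OpenAI:

```python
# Back-of-the-envelope: what bulb wattage does "0.34 Wh ~= 20 seconds" imply?
WH_PER_PROMPT = 0.34   # reported energy per ChatGPT prompt, in watt-hours
BULB_SECONDS = 20      # duration in the lightbulb comparison

# Power (W) = energy (Wh) * 3600 (s/h) / time (s)
implied_bulb_watts = WH_PER_PROMPT * 3600 / BULB_SECONDS
print(f"Implied bulb power: {implied_bulb_watts:.1f} W")  # 61.2 W
```

That works out to about 61 W, which is to say an incandescent-class bulb rather than a modern LED.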

On Thursday, Google released its own figures, reporting that the average search using Gemini, the company's widely used AI tool, consumes 0.24 watt-hours, comparable to watching about nine seconds of television, and generates 0.03 grams of carbon dioxide equivalent. Notably, Google says Gemini's text queries have become much cleaner over time: over the past year, energy consumption per query dropped by roughly 97%, and carbon emissions per query fell by 98%. A separate report from Google published earlier in the summer pointed to a decoupling of data center energy consumption from the emissions it produces. (It is worth noting that simple text queries are less resource-intensive than image, audio, or video generation, and these statistics do not cover model training, which Google's report omits because it is difficult to calculate precisely.)
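Google's equivalences can be checked the same way. The implied television wattage and last year's per-query energy below are our own inferences from the stated figures, not numbers Google published:

```python
# Sanity-check the Gemini figures: 0.24 Wh per query ~= 9 seconds of TV.
WH_PER_QUERY = 0.24    # Google's reported energy per Gemini text query
TV_SECONDS = 9         # duration in the television comparison

implied_tv_watts = WH_PER_QUERY * 3600 / TV_SECONDS
print(f"Implied TV power draw: {implied_tv_watts:.0f} W")  # 96 W

# A 97% drop means today's query uses 3% of last year's energy.
implied_last_year_wh = WH_PER_QUERY / (1 - 0.97)
print(f"Implied energy per query a year ago: {implied_last_year_wh:.1f} Wh")  # 8.0 Wh
```

A 96 W set is a plausible mid-size television, and the implied 8 Wh per query a year ago underscores how steep the claimed reduction is.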

Whether this downward trend continues is a key question for anyone watching the future of energy and climate in the U.S. It has significant implications not only for future U.S. emissions but also for the hundreds of billions of dollars being invested in the power sector. Leaders across interconnected industries will have to balance rising demand for AI against the risk of overbuilding infrastructure as AI models become more efficient.

Google’s advancements are primarily attributable to two factors: cleaner power sources, and more efficient chips paired with optimized query processing.

The clean energy strategy is notable, yet conceptually simple: the company buys substantial amounts of renewable energy for its operations, having signed agreements last year to purchase 8 GW of clean power. That capacity is equivalent to roughly 2,400 utility-scale wind turbines, based on Department of Energy figures. Looking ahead, Google has also invested in supporting other emerging clean technologies, such as nuclear fusion.
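The turbine comparison implies a per-turbine capacity that can be worked out directly; the division below is our own arithmetic, not a figure from Google or the DOE:

```python
# Implied capacity per turbine behind the "8 GW ~= 2,400 turbines" comparison.
CLEAN_POWER_GW = 8       # clean power purchased under last year's agreements
TURBINE_COUNT = 2400     # equivalent number of utility-scale wind turbines

mw_per_turbine = CLEAN_POWER_GW * 1000 / TURBINE_COUNT  # GW -> MW, then divide
print(f"Implied capacity per turbine: {mw_per_turbine:.2f} MW")  # 3.33 MW
```

About 3.3 MW per turbine is in the range the DOE reports for recently installed utility-scale machines, so the comparison hangs together.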

Beyond clean energy, the company leans heavily on efficiency. In energy circles, efficiency usually means getting the same work done with less power, as with better climate control or improved insulation, but Google's most notable gains come from within the AI ecosystem itself rather than the energy system. The company has developed its own custom chips, known as TPUs, as an alternative to the more commonly used GPUs, and they have grown steadily more efficient, by roughly 30-fold since 2018, according to Google's sustainability report. Google has also made its models more efficient through new query-processing techniques that reduce the computation required. And just weeks earlier, the company unveiled a program to shift data center demand to periods when the electricity grid is under less stress.
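A 30-fold improvement since 2018 works out to a steep compound annual rate. The seven-year span below is an assumption on our part, since the report says only "since 2018":

```python
# Annualized rate implied by a 30x TPU efficiency gain over ~7 years.
# The 7-year span (2018-2025) is an assumption; Google states only "since 2018".
TOTAL_GAIN = 30
YEARS = 7

annual_rate = TOTAL_GAIN ** (1 / YEARS) - 1  # compound annual growth rate
print(f"Implied annual efficiency improvement: {annual_rate:.0%}")  # 63%
```

Sustaining roughly 60% year-over-year efficiency gains is the crux of the question the next paragraphs raise.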

A pivotal question for Google, and for any company heavily invested in AI, is whether these initiatives and the efficiency improvements they produce can be sustained. Substantial gains would represent a significant victory for climate efforts, provided that escalating usage does not cancel them out.

Increased efficiency would also carry profound implications throughout the energy sector. Power companies are currently investing heavily in new electricity generation on the assumption that AI will keep fueling demand growth, but forecasting the exact pace of that growth remains difficult. Anticipated efficiency improvements are a major source of this uncertainty, and Google's findings should give forecasters pause about this "known unknown."

To receive this article directly in your inbox, consider subscribing to the TIME CO2 Leadership Report newsletter.