
The True Price of Every ChatGPT Prompt

Each time someone asks ChatGPT a question, the exchange feels nearly invisible: just text appearing on a screen. But behind that response lies a massive physical infrastructure drawing electricity, consuming water, and producing carbon emissions at an unprecedented scale. As generative artificial intelligence becomes embedded in everyday life, the environmental footprint of the system powering it is growing just as rapidly.

OpenAI has acknowledged that users now send roughly 25 billion prompts per day to ChatGPT. Each prompt requires computation inside energy-intensive data centers, where servers run continuously to process, store, and generate responses. While a single query may seem insignificant, the cumulative demand of billions of interactions translates into real and measurable environmental costs.

While a single AI query may feel insignificant, everyday use adds up quickly at scale. A typical office worker might use ChatGPT about 20 times a day to summarize a meeting, draft emails, brainstorm ideas, and outline a report. Each prompt uses an estimated 0.34 watt-hours of electricity — roughly the amount of energy needed to run a standard LED lightbulb for about two minutes. Over a day, that adds up to about 6.8 watt-hours per person. 

On its own, 6.8 watt-hours per day is minimal, but scaled to one million daily users, that becomes 6,800 kilowatt-hours, enough to power roughly 225 U.S. homes for a full day. At 100 million users, the figure jumps to 680,000 kilowatt-hours daily, comparable to the electricity consumption of more than 22,000 households. Small individual actions, multiplied globally, begin to resemble the energy footprint of an entire community.
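The scaling arithmetic above can be sanity-checked in a few lines. The per-prompt and per-user figures come from the article; the 30 kWh/day figure for an average U.S. home is an assumption added here for comparison:

```python
# Back-of-envelope check of the per-user energy figures scaled to millions
# of users. 0.34 Wh/prompt and 20 prompts/day are the article's figures;
# 30 kWh/day per U.S. home is an assumed average, used only for scale.

WH_PER_PROMPT = 0.34
PROMPTS_PER_DAY = 20
KWH_PER_US_HOME_PER_DAY = 30  # assumption, not from the article

per_user_wh = WH_PER_PROMPT * PROMPTS_PER_DAY  # ~6.8 Wh per user per day

for users in (1_000_000, 100_000_000):
    total_kwh = per_user_wh * users / 1000  # Wh -> kWh
    homes = total_kwh / KWH_PER_US_HOME_PER_DAY
    print(f"{users:>11,} users: {total_kwh:>9,.0f} kWh/day ≈ {homes:,.0f} homes")
```

With these assumptions, one million users works out to 6,800 kWh/day (a couple hundred homes), and 100 million users to 680,000 kWh/day, in line with the figures quoted above.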

Data Centers and the AI Boom

Generative AI relies heavily on hyperscale data centers, the largest category of computing facilities in the world. As of 2024, there were 1,136 hyperscale data centers globally, with the United States accounting for 54% of global capacity. These facilities are expanding in both size and power density, driven largely by AI workloads that require constant, high-performance computing. 

The environmental implications are already visible. At more than 4% of total U.S. electricity consumption, data centers now use roughly as much power as all residential lighting nationwide, and more electricity than many individual states consume in total. Some projections suggest that share could rise to between 6.7% and 12% by 2028 as AI systems continue to scale.

This surge in electricity demand presents a challenge for both U.S. national climate targets and global climate goals under the Paris Agreement, which depend on rapidly reducing fossil fuel use even as electricity demand grows. While renewable energy capacity is expanding, much of the U.S. grid still relies on fossil fuels. As a result, increased electricity use by data centers often corresponds directly with higher greenhouse gas emissions.

Carbon Emissions at Scale

The climate impact of AI infrastructure is no longer speculative. A roadmap study from Cornell University estimates that AI-driven data center expansion could generate between 24 and 44 million metric tons of carbon dioxide emissions annually by 2030. That range is comparable to adding millions of gasoline-powered vehicles to U.S. roads each year.
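The "millions of vehicles" comparison above can be made concrete with one assumed conversion factor: roughly 4.6 metric tons of CO2 per year for a typical U.S. passenger vehicle, a commonly used EPA estimate that is not stated in the article itself.

```python
# Rough vehicle-equivalent of the Cornell emissions range (24-44 million
# metric tons CO2/year). The 4.6 t/vehicle/year factor is an assumed,
# commonly cited EPA estimate for a typical U.S. passenger vehicle.

TONS_PER_VEHICLE_PER_YEAR = 4.6  # assumption, not from the article

for mt_co2 in (24, 44):  # million metric tons CO2 per year
    vehicles_millions = mt_co2 / TONS_PER_VEHICLE_PER_YEAR
    print(f"{mt_co2} Mt CO2/yr ≈ {vehicles_millions:.1f} million vehicles")
```

Under that assumption, the 24-44 Mt range corresponds to roughly 5 to 10 million vehicles' worth of annual emissions.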

The emissions stem from multiple sources: the electricity required to run servers, the energy used for cooling, and the upstream carbon footprint of power generation. According to the U.S. Energy Information Administration, fossil-fuel power plants also consume substantial amounts of water, linking carbon emissions and water use in ways that amplify environmental stress.

PULLQUOTE: “In recent years advances in AI systems and services have largely been driven by a race for size and scale, demanding increasing amounts of computational power — and generally without much regard for resource efficiency.” – Prof. Tom Rodden, University of Nottingham, quoted in The Guardian on AI data center energy and water use

AI’s Thirst for Water

While electricity consumption often dominates discussions about AI’s climate impact, water use is an equally pressing concern. Data centers rely on water-based cooling systems to prevent servers from overheating. Research summarized by MIT Technology Review shows that AI servers operating within standard “cool” temperature ranges (typically between 18°C and 27°C, or 64–81°F) can require one to two liters of water per kilowatt-hour of electricity consumed, depending on system design and local climate conditions. At that rate, every kilowatt-hour of AI computation can consume roughly as much water as one person drinks in a day, just to keep servers cool.

In 2023, U.S. data centers consumed an estimated 17 billion gallons of water. Given that the average American uses between 30,000 and 36,500 gallons of water annually, that volume could meet the full yearly water needs of roughly half a million people. In effect, data centers now “drink” as much water each year as a mid-sized American city.
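The "half a million people" comparison follows directly from the two figures quoted above, as a quick calculation shows:

```python
# Checking the population-equivalent of U.S. data center water use.
# Both inputs are the article's figures: 17 billion gallons consumed in
# 2023, and 30,000-36,500 gallons per American per year.

TOTAL_GALLONS = 17e9

for gal_per_person in (30_000, 36_500):
    people = TOTAL_GALLONS / gal_per_person
    print(f"at {gal_per_person:,} gal/person/yr: ~{people:,.0f} people")
```

Depending on which end of the per-person range is used, the result falls between roughly 465,000 and 567,000 people, consistent with "roughly half a million."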

The consequences are especially severe in arid regions. A Bloomberg analysis found that many new AI-driven data centers are being built in already water-stressed areas, including the American Southwest. In Reno, Nevada, a growing data center hub, climate assessments show high long-term drought risk, raising concerns about the sustainability of continued industrial water use — and putting residents at greater risk of water restrictions, rising utility costs, and heightened vulnerability during prolonged drought conditions.

Local investigations have already documented these pressures. Reporting by The New York Times found that a Meta data center in Georgia uses roughly 500,000 gallons of water per day — enough to supply the daily water needs of several thousand residents. Similar conflicts are emerging nationwide as communities grapple with competing demands for limited freshwater.

Transparency Gaps and Policy Challenges

Despite the scale of energy and water consumption, reporting requirements remain limited. Most technology companies do not publicly disclose facility-level data on water withdrawals or cooling practices. Experts cited by The Guardian have warned that the lack of mandatory reporting for data center energy and water use makes it difficult for regulators and communities to assess environmental risks or plan for infrastructure strain.

Without transparency, local governments may approve new facilities without fully understanding long-term impacts on water systems, electricity prices, or emissions targets.

Paths Toward Sustainable AI

There are viable ways to reduce AI’s environmental footprint. Research from Cornell indicates that strategic siting of data centers, renewable energy integration, and advanced cooling technologies could significantly lower emissions and water use if adopted at scale. In some scenarios, improved cooling efficiency alone could reduce water consumption by nearly a third.

The United Nations has repeatedly emphasized that electrification (shifting cars from gasoline to electric vehicles, replacing gas heating with electric systems, and digitizing infrastructure) must be paired with rapid expansion of renewable energy sources like wind and solar. Without decarbonizing the grid at the same pace that electricity demand grows, increased digital and AI infrastructure risks locking in higher emissions rather than reducing them.

Innovation With Accountability

AI systems like ChatGPT offer real social benefits, from education and accessibility to research and communication. But as adoption accelerates, so does the responsibility to ensure that innovation does not come at the expense of climate stability and water security. 

EARTHDAY.ORG has long focused on making environmental costs visible. The infrastructure behind AI is no exception. As billions of daily prompts translate into rising energy use, water withdrawals, and carbon emissions, the future of AI will depend not just on technological advancement, but on whether its growth aligns with the planet’s ecological limits.

Powering AI Without Polluting the Planet

You can help ensure that all energy infrastructure — including the power behind AI — is clean, renewable and responsibly built.

Sign the Renewable Energy Petition urging world leaders to triple renewable energy generation by 2030 — a benchmark climate scientists say is essential to meeting global emissions targets. Past public pressure has helped accelerate renewable commitments at both national and corporate levels, and continued engagement remains critical.
In the U.S.? You can also send a message to your local lawmakers urging them to invest in renewable energy and grid modernization so that digital innovation doesn’t deepen climate harm.


This article is available for republishing on your website, newsletter, magazine, newspaper, or blog. The accompanying imagery is cleared for use with attribution. Please ensure that the author’s name and their affiliation with EARTHDAY.ORG are credited. Kindly inform us if you republish so we can acknowledge, tag, or repost your content. You may notify us via email at [email protected]. Want more articles? Follow us on Substack.
