The article explains that every time you use an AI tool like ChatGPT, water is consumed somewhere in the process, but estimates of how much vary wildly depending on how the calculation is scoped. A Morgan Stanley report projects that AI data centers could consume about 1,068 billion liters of water annually by 2028, a figure that counts all the water tied to the electricity generation, cooling systems, and manufacturing behind AI operations; for comparison, the average American uses roughly 243,000 liters per year. OpenAI CEO Sam Altman, by contrast, has claimed that an average ChatGPT query uses only about 0.000085 gallons of water (roughly a fifteenth of a teaspoon), a number that tracks just the immediate processing, not the broader lifecycle.

Experts say both numbers can be "technically correct" depending on scope: whether you count lifecycle water use or only the direct cooling during a query. That variance highlights how complex and opaque the true water footprint of AI systems remains.
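A quick sanity check shows the two headline figures are internally consistent. The snippet below converts Altman's per-query estimate to teaspoons and compares Morgan Stanley's 2028 projection to the cited per-capita figure; the unit conversions (1 US gallon = 768 US teaspoons) are standard, but the framing is illustrative only, not from the article.

```python
# Sanity-check the two water figures cited above.
# Assumption: standard US units (1 US gallon = 768 US teaspoons).

GALLONS_PER_QUERY = 0.000085       # Altman's per-query estimate
TEASPOONS_PER_GALLON = 768

teaspoons_per_query = GALLONS_PER_QUERY * TEASPOONS_PER_GALLON
# ~0.065 tsp, i.e. roughly 1/15 of a teaspoon, matching the article
print(f"Per query: ~1/{round(1 / teaspoons_per_query)} of a teaspoon")

AI_LITERS_PER_YEAR = 1_068e9       # Morgan Stanley's 2028 projection
PERSON_LITERS_PER_YEAR = 243_000   # cited annual use per American

equivalent_people = AI_LITERS_PER_YEAR / PERSON_LITERS_PER_YEAR
print(f"Projected AI use ~= {equivalent_people / 1e6:.1f} million Americans' annual use")
```

Run as written, this confirms the "fifteenth of a teaspoon" characterization and shows the projected AI total equals the annual water use of roughly 4.4 million Americans, which is why the two numbers can both be "technically correct" while sounding wildly different.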

Recent news