An expert on environmental policy measured every drop of water he used during months of heavy AI work. The findings reveal we may be worrying about the wrong environmental crisis.
(Photo: Craig Hastings / Getty)
Published April 6, 2026 02:14PM
People are worried about Artificial Intelligence (AI) and its use of water. If you care about the Colorado River, if you have watched Lake Powell’s water level drop and Lake Mead shrink, and have felt the dread of living in a place that is running out of its most essential resource, then hearing that a new technology is guzzling water hits a nerve. It should. The instinct to protect what is disappearing is a good one. But it turns out that AI isn’t as dire a threat to our water as people may think.
I work on Colorado River water for a living as a filmmaker and storyteller. I have a PhD in engineering and public policy. I am Diné. The threats to the river are not abstract to me; they are very real. So earlier this year, I decided to quantify something that has been missing in the conversation about AI and water: I measured my own AI water use.
For 11 weeks, I tracked all of my AI use. One hundred sessions. I counted the tokens processed and applied publicly available per-token energy figures from Epoch AI along with operator-reported water-efficiency data from Microsoft and Google. Anyone can run this math.
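If you want to check the arithmetic yourself, the skeleton of it looks something like the sketch below. The per-token energy and water intensities here are illustrative placeholders, not the exact figures I used; substitute the Epoch AI and operator-reported numbers you trust.

```python
# Minimal sketch of the per-session estimate. All constants below are
# illustrative assumptions, not the exact inputs used in this story.

WH_PER_TOKEN = 0.0003        # assumed energy per token processed (watt-hours)
ONSITE_L_PER_KWH = 0.5       # assumed data-center cooling water (liters per kWh)
UPSTREAM_L_PER_KWH = 3.0     # assumed power-plant water use (liters per kWh)
LITERS_PER_GALLON = 3.785

def session_water_gallons(tokens: int) -> float:
    """Rough lifecycle water estimate for one AI session, in gallons."""
    kwh = tokens * WH_PER_TOKEN / 1000.0
    liters = kwh * (ONSITE_L_PER_KWH + UPSTREAM_L_PER_KWH)
    return liters / LITERS_PER_GALLON

# Example: one long session that processes 200,000 tokens
print(f"{session_water_gallons(200_000):.4f} gallons")
```

Run it session by session, add up the results, and you have the same kind of total I arrived at.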
In those 11 weeks, I built an iOS app from scratch and wrote policy briefs on extreme heat for nonprofits I work with. I produced documentary pitch decks and drafted a 15,000-word climate fiction piece about the Colorado River collapse. I used AI every single day, often for hours at a time.
Total lifecycle water footprint of all that work: about five gallons. That accounts for everything: the water used to cool the data centers, the water consumed at power plants to generate the electricity, and the water embedded in manufacturing the hardware.
When an Outside editor reached out to ask me to write this story, I was on a trip to Marble Canyon, Arizona, to train raft guide companies on what is happening with the river. I drove my diesel Sprinter van from Tucson to the site, a 383-mile trip at about 20 miles per gallon. When I ran the numbers later, the lifecycle water footprint of my fuel was around 110 gallons. One drive to the work I do on the Colorado River used more than 20 times the water of everything I did with AI in 11 weeks. That comparison stopped me cold, and I study this for a living.
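For anyone who wants the back-of-the-envelope version of that comparison: the diesel water intensity below is simply what the roughly 110-gallon figure implies for this trip, about 5.7 gallons of water per gallon of fuel, so treat it as an approximation rather than an official constant.

```python
# Back-of-the-envelope version of the drive comparison above.
MILES = 383
MPG = 20
WATER_GAL_PER_DIESEL_GAL = 5.7   # implied by ~110 gallons for this trip; an approximation
AI_WATER_GAL = 5.0               # my 11-week AI total from above

fuel_gal = MILES / MPG                                   # about 19 gallons of diesel
drive_water_gal = fuel_gal * WATER_GAL_PER_DIESEL_GAL    # about 110 gallons of water
print(f"Drive: {drive_water_gal:.0f} gal water, "
      f"{drive_water_gal / AI_WATER_GAL:.0f}x my AI total")
```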
You may have read stories about how data centers use lots of water, and how these massive warehouses of computer servers are being built across the country to help with the expanding use of AI. Here is the part that I think gets lost in the discourse. All U.S. data centers combined—not just AI, all of them—account for roughly 0.3 percent of total national water withdrawals. Agriculture consumes approximately 80 percent of Colorado River water. In my home state of Arizona, agriculture is 86 percent of the state’s water use. These are not competing concerns on the same scale. They are separated by orders of magnitude.
I know what the next question is, because I get it every time: Sure, but AI is growing exponentially. Will it not eventually become the problem? It is a fair question, and it deserves a real answer.
The evidence says no. Inference efficiency, meaning how much energy it takes to actually answer a single query, is improving dramatically. A 2025 Microsoft Research paper found that combined advances in hardware, software, and model architecture can cut energy per query by a factor of 8 to 20. The cost of running AI systems comparable to GPT-3.5 dropped by a factor of more than 280 between late 2022 and late 2024. Hardware efficiency gains are running at about 40 percent per year. And AI companies have an enormous financial incentive to keep pushing efficiency.
Electricity is one of their biggest operating costs. They are not going to burn more power than they have to. Even the International Energy Agency’s 2025 Energy and AI report projects that data centers will account for roughly three percent of global electricity consumption by 2030. That is worth monitoring. It is not the crisis.
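To make the compounding concrete, here is what a sustained 40 percent annual hardware efficiency gain does to per-query energy on its own. This is arithmetic under that single assumption, not a forecast, and the software and model-architecture gains stack on top of it.

```python
# Illustrative compounding only: assumes hardware efficiency keeps improving
# at roughly 40 percent per year, with all else held constant.
rate = 0.40
for years in (1, 3, 5):
    remaining = 1 / (1 + rate) ** years   # fraction of today's energy per query
    print(f"after {years} yr: {remaining:.0%} of today's energy per query")
```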
It is also worth understanding where AI’s water footprint actually comes from. Most of it is not water running through a data center’s cooling towers. Most of it comes from generating the electricity that powers the servers. That is scope-2 water, the water consumed at power plants through evaporative cooling and steam generation. We do not hold this against any other electricity consumer. Nobody is calculating the water footprint of your refrigerator or your electric car or the subway.
But when AI uses that same grid electricity, suddenly it is labeled a water crisis. The water is real. The inconsistency in how we talk about it is also real.
The current crisis facing the Colorado River is the same one it’s faced for 100 years. The Colorado River was divided up in 1922 based on flow measurements that, as it turned out, were calculated from some of the wettest years on record. We over-allocated a river we had overestimated, and then the climate started warming.
Streamflow has dropped roughly 20 percent since 2000. The math was never going to work. It is not working now. And the 2026 Compact renegotiation, the most consequential water policy event in a century, is happening right now with a fraction of the public attention that a viral video about ChatGPT’s water use receives. Alfalfa irrigation in California’s Imperial Valley alone consumes over 800 billion gallons a year. That is where the water is going. That is what needs your attention.
I am not asking anyone to stop caring about AI’s water use; I am asking you to focus instead on the scale of water use by all of the industries that rely on the Colorado River. At the local level, a single data center can stress a small community’s water supply, and that is worth watching. But when the broader discourse frames AI as the driver of the Western water crisis, it pulls focus from the systems that actually drain the river. The river needs you. It just needs you pointed at the 80 percent, not the 0.3.
How I Ran the Numbers
Water withdrawal and use data come from the U.S. Geological Survey’s 2021 national water use estimates and the Bureau of Reclamation’s Colorado River accounting. Per-token energy intensity is drawn from Epoch AI and Lin (2025), with water use derived from operator-reported Water Usage Effectiveness data published by Microsoft and Google. The 0.3 percent data center figure is consistent with Lawrence Berkeley National Laboratory’s 2024 U.S. data center energy report. Inference efficiency projections reference a 2025 Microsoft Research paper on AI inference energy pathways (Oviedo et al.). The IEA’s base case projection for global data center electricity demand comes from its 2025 Energy and AI special report. The diesel lifecycle water footprint is calculated using Argonne National Laboratory’s GREET model. Colorado River over-allocation history and streamflow decline data are drawn from Bureau of Reclamation records and peer-reviewed hydrology research, including Udall and Overpeck (2017) on Colorado River flow loss and Milly and Dunne (2020) on evapotranspiration trends.
