[ ENVIRONMENT ]
2.4 gallons of water per AI email
A single AI-generated email uses 2.4 gallons of water for cooling. By 2028, AI data centers could consume 300 TWh of electricity annually. No major AI company publishes AI-specific sustainability figures.

The water you can't see
Generating a single 100-word AI email consumes approximately 2.4 gallons of water, used primarily to cool the data center hardware running the inference. Scale that to hundreds of millions of daily queries across all major AI services, and the water consumption reaches into billions of gallons per year. These data centers are frequently located in regions already facing water stress, competing directly with agricultural and residential water needs. The convenience of asking an AI to draft your email comes with a physical cost that never appears on your screen.
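The scaling claim above is easy to sanity-check with back-of-envelope arithmetic. The per-email figure is the one cited in this article; the daily query volume is an illustrative assumption, not a measured number:

```python
# Back-of-envelope check of annual water use from AI queries.
# 2.4 gal/email is the figure cited above; the query volume is
# a hypothetical stand-in for "hundreds of millions per day".
GALLONS_PER_EMAIL = 2.4
ASSUMED_DAILY_QUERIES = 300_000_000  # illustrative assumption
DAYS_PER_YEAR = 365

annual_gallons = GALLONS_PER_EMAIL * ASSUMED_DAILY_QUERIES * DAYS_PER_YEAR
print(f"{annual_gallons / 1e9:.1f} billion gallons per year")
# At these assumptions: 262.8 billion gallons per year
```

Even if the real query volume is several times lower, the total still lands comfortably in the billions of gallons.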
Electricity consumption projections
Industry analysts project that AI data centers could consume 300 terawatt-hours of electricity annually by 2028. To put that in perspective, that figure exceeds the total electricity consumption of many mid-sized countries. The rapid scaling of model sizes and inference volumes is driving demand that existing power grids were never designed to handle. Every new model generation requires more compute for training and more energy for serving, with no sign of this trajectory flattening.
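To translate the 300 TWh projection into continuous generating capacity, a quick sketch (the TWh figure is the projection cited above; the unit conversion itself is standard):

```python
# Convert an annual energy figure into average continuous power draw.
PROJECTED_TWH_PER_YEAR = 300   # analyst projection cited above
HOURS_PER_YEAR = 8_760         # 365 * 24

# 1 TWh = 1,000 GWh; dividing by hours in a year gives average GW.
average_gw = PROJECTED_TWH_PER_YEAR * 1_000 / HOURS_PER_YEAR
print(f"Average continuous draw: {average_gw:.1f} GW")
# Average continuous draw: 34.2 GW
```

At roughly 1 GW per large nuclear reactor, that is the output of about 34 reactors running around the clock, which puts the industry's nuclear deals in context.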
The missing sustainability reports
Not a single major AI company has published a comprehensive sustainability report that accounts for the full environmental impact of their AI operations. Google, Microsoft, and Meta all publish corporate sustainability reports, but the AI-specific energy and water figures are either absent, aggregated beyond usefulness, or framed in ways that obscure the true scale. When companies that pride themselves on transparency refuse to disclose the environmental cost of their fastest-growing product category, the silence itself is informative.
The nuclear plant race
Major tech companies are now acquiring or contracting with nuclear power plants to feed their data center expansion. Microsoft signed a deal to restart a unit at Three Mile Island. Amazon has invested in nuclear energy projects. Google has explored small modular reactors. The fact that the world's largest technology companies need dedicated nuclear facilities to power their AI ambitions should clarify the scale of energy consumption involved. These are not incremental infrastructure decisions. They are industrial-scale commitments driven by demand that renewable energy alone cannot yet satisfy.
How SecureGPT approaches infrastructure differently
SecureGPT runs on trusted, eco-friendly servers located in Canada and the EU, where energy grids have significantly higher renewable content. Our architecture is designed around a minimal compute footprint: the server processes your request and discards it, with no massive training runs, no persistent data storage, and no sprawling GPU clusters. We use efficient open-source models that deliver strong performance without the environmental overhead of the largest proprietary systems. Reducing your AI footprint does not require giving up capability. It requires making deliberate infrastructure choices.