The AI Compass
While Artificial Intelligence often seems to live in the "cloud," its impact is decidedly physical. From massive data centers to global energy grids, AI depends on significant natural resources to function.
AI consumes energy in two main phases: Training and Inference. Training feeds massive datasets through an algorithm to "teach" it patterns. Inference happens every time you use the trained model to generate answers or images.
Energy Cost: Standard Search vs. Gen AI
*Estimates vary across research studies comparing keyword search to LLM inference.
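To make the comparison concrete, here is a back-of-the-envelope sketch. The per-query figures below are assumed round numbers for illustration only, since published estimates vary widely by model, hardware, and workload:

```python
# Rough per-query energy comparison: keyword search vs. LLM inference.
# Both figures are ASSUMED ballpark values, not measurements.
SEARCH_WH = 0.3   # assumed watt-hours per keyword search
LLM_WH = 3.0      # assumed watt-hours per LLM inference query

ratio = LLM_WH / SEARCH_WH
print(f"One LLM query ~= {ratio:.0f}x a keyword search")

# Scale a single user's habit up to a year:
queries_per_day = 20
yearly_kwh = LLM_WH * queries_per_day * 365 / 1000
print(f"{queries_per_day} LLM queries/day ~= {yearly_kwh:.1f} kWh per year")
```

Even with generous assumptions, the point survives: the gap per query is small in absolute terms, but it compounds quickly across billions of daily queries.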
Data centers generate enormous amounts of heat, and the servers must be cooled constantly to keep them from overheating. While some facilities use air cooling, many high-performance AI centers rely on evaporative water cooling towers.
This process consumes potable (drinkable) water. Estimates suggest that a conversation of 20-50 questions with a large chatbot "drinks" roughly a 500 ml bottle of water through evaporation in the cooling towers.
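The per-question cost follows directly from the figures above; a quick division shows the range:

```python
# Per-question water estimate from the figures in the text:
# a 20-50 question conversation evaporates ~500 ml of cooling water.
BOTTLE_ML = 500
low_questions, high_questions = 20, 50

per_question_high = BOTTLE_ML / low_questions   # fewer questions -> more water each
per_question_low = BOTTLE_ML / high_questions
print(f"~{per_question_low:.0f}-{per_question_high:.0f} ml of water per question")
```

That is roughly 10-25 ml per question, a few teaspoons, which again only becomes significant at the scale of millions of users.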
The carbon impact of AI depends heavily on where the computation happens. If a data center sits in a region powered by coal or gas, the AI's carbon footprint is high; the same workload run on a renewable-heavy grid emits far less. Tech companies are currently racing to build "Green AI" by optimizing models and code to use less energy.
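The "where" matters because carbon footprint is simply energy used multiplied by the carbon intensity of the local grid. The intensities below are assumed ballpark values (real figures vary by region and even by hour):

```python
# Carbon footprint = energy consumed x grid carbon intensity.
# Intensity values are ASSUMED ballpark figures in gCO2 per kWh.
COAL_HEAVY_GRID = 800   # assumed gCO2/kWh for a fossil-heavy grid
RENEWABLE_GRID = 50     # assumed gCO2/kWh for a renewable-heavy grid

energy_kwh = 10.0       # hypothetical energy for some AI workload

for name, intensity in [("coal-heavy grid", COAL_HEAVY_GRID),
                        ("renewable grid", RENEWABLE_GRID)]:
    kg_co2 = energy_kwh * intensity / 1000
    print(f"{name}: {kg_co2:.1f} kg CO2")
```

The same ten kilowatt-hours produce over an order of magnitude more CO2 on the dirtier grid, which is why data center siting is as important as code efficiency.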
To learn more about the physical impact of digital tools, explore these resources: