Every day, 2.5 billion messages are sent to ChatGPT, consuming enough energy to power a lightbulb for 9,500 years. And that’s not counting the other artificial intelligence (AI) bots that are rising in popularity: Google Gemini, Claude and Perplexity, to name a few. AI is inescapable in 2025, and has recently become a part of the resources available to Menlo students.
Menlo’s AI assembly last month focused mainly on academic integrity, but AI’s environmental impact deserves just as much attention. AI’s growing use comes at a steep environmental cost in both energy and water, which students must recognize in order to use AI responsibly.
Training large language models (LLMs) like ChatGPT or Google Gemini requires a substantial amount of energy, even before they become available to the public. According to reporting by MIT Technology Review journalists James O’Donnell and Casey Crownhart, GPT-4 is estimated to have consumed 50 gigawatt-hours of electricity during training, enough to power the entire city of San Francisco for three days.
Even so, it’s everyday use by customers, not training, that makes up an estimated 80 to 90 percent of all AI energy consumption. AI’s popularity is currently outpacing the expansion of electricity grids and infrastructure: over 70 percent of the electric grid is more than 25 years old, and the aging system can’t keep up with the increased power demand AI creates. The grid, the network that moves electricity from power plants to people, already loses energy in transmission, and straining it further drives up emissions.
Electricity isn’t the only thing AI consumes; it also uses a significant amount of water. Have you ever opened too many tabs on your computer? The fans blast to keep it from overheating. The same thing happens with AI: the more questions a server answers at once, the hotter it gets, so AI data centers must cool their servers to keep them running smoothly.
These centers use millions of liters of water a day for server cooling. Big companies like Google and Meta used approximately 580 billion gallons of water to cool data centers in 2022 alone. An obvious fix would be to recycle that water, but many data centers already do, and roughly 80 percent of the recycled water is still lost to evaporation in the process.
This water doesn’t appear out of nowhere; it’s drawn from communities that need it to live. Two-thirds of new AI data centers are built in areas prone to drought. In Arizona and Texas, for example, data centers have diverted scarce water away from agriculture. I don’t know about you, but if I had to choose between asking AI a question and having food, I’d prefer the food.
So, are we going to stop using AI completely, hold hands and sing Kumbaya? Of course not. But we must be aware of AI’s impact and reach for it only when we actually need it. Here’s a simple tip that we, as Menlo students, can use to do our part: for easily researchable questions, just use Google. You don’t need AI to answer simple questions, and by searching the internet yourself, you interact directly with sources and click on links rather than receiving paraphrased information. I understand what I’m researching better when I find the information myself.
As we enter the age of AI and its use becomes ever more integrated into our daily lives, I encourage you to reflect: does the necessity of your AI search outweigh its environmental cost? If your answer is no, then the solution is simple: be mindful of your AI usage.
