Most of us don’t think twice before typing something into an AI chatbot. A random question, a casual greeting, or even a polite “thank you” at the end may all feel harmless. Look at X, where Grok 4, the chatbot created by Elon Musk’s xAI, roams, and you will see thousands of people tagging the AI chatbot in all things light and serious. “Grok bhai, check this” is a message repeated often on X.
But behind the scenes, every single message we send to AI tools like Grok, ChatGPT, DeepSeek, or any other chatbot uses electricity, server space, and other resources. The very real pressure these tools put on energy systems is beginning to be noticed, not just by tech companies but also by policymakers, activists and everyone else trying to keep the planet cool in the middle of global warming.
You see, these chatbots run on massive data centres that need huge amounts of energy to operate. That means even a simple and unnecessary query uses up resources. And when you multiply that by millions of users doing the same thing every day, it starts to add up for tech companies, and in the grander scheme of things, for the planet.
You may wonder what we are trying to imply here. Let us explain. On a fine April day, an X user who goes by the name Tomie asked a simple question: “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models.” It was meant as a lighthearted post, but OpenAI CEO Sam Altman responded with, “Tens of millions of dollars well spent — you never know.” That reply caught people’s attention. It got them thinking: is being polite to AI really costing millions? And if so, what does that mean for energy use and the environment?
Generative AI (Grok 4, ChatGPT, Gemini and the like) uses extremely high amounts of energy, especially during the training phase of models. But even after training, every single interaction, no matter how small, requires computing power. Those polite phrases, sweet as they are, still count as queries, whether they are serious or not. And queries take processing power, which in turn consumes electricity. You see the pattern? It is all interrelated.
Energy use, but just how much?
AI systems are still relatively new, so precise, concrete details about how much energy they use are still coming in. But there are some estimates.
For example, the AI tool DeepSeek estimates that a short AI response to something like “thank you” may use around 0.001 to 0.01 kWh of electricity. That sounds tiny for a single query. But scale changes everything. If one million people send such a message every day, the energy use could reach 1,000 to 10,000 kWh daily. Over a year, that becomes hundreds to thousands of megawatt-hours, enough to power several homes for months.
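The scaling above is simple multiplication, and it can be checked in a few lines. This is a back-of-envelope sketch using the article’s own assumed figures (0.001–0.01 kWh per query, one million queries a day), not measured values:

```python
# Back-of-envelope check of the per-query energy estimate.
# All inputs are the article's assumptions, not measurements.
KWH_PER_QUERY_LOW = 0.001   # low estimate for a short "thank you" reply
KWH_PER_QUERY_HIGH = 0.01   # high estimate
QUERIES_PER_DAY = 1_000_000  # one million polite messages a day

daily_low = KWH_PER_QUERY_LOW * QUERIES_PER_DAY    # ~1,000 kWh/day
daily_high = KWH_PER_QUERY_HIGH * QUERIES_PER_DAY  # ~10,000 kWh/day

# Annualise and convert kWh to MWh (divide by 1,000).
yearly_low_mwh = daily_low * 365 / 1000
yearly_high_mwh = daily_high * 365 / 1000

print(f"Daily:  {daily_low:,.0f} to {daily_high:,.0f} kWh")
print(f"Yearly: {yearly_low_mwh:,.0f} to {yearly_high_mwh:,.0f} MWh")
```

Running the numbers gives roughly 365 to 3,650 MWh a year, which is where the “hundreds to thousands of megawatt-hours” range comes from.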
Energy use is similar across AI systems. MIT Technology Review carried out a study in May and came up with some figures. Among its conclusions was an estimate of how much energy a person who actively uses AI forces the system to consume in a day: “You’d use about 2.9 kilowatt-hours of electricity — enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours,” the study noted.
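The study’s comparisons can be sanity-checked with typical consumption figures. The per-mile and microwave numbers below are common reference values we are assuming for illustration, not figures from the MIT Technology Review study itself:

```python
# Sanity-check the 2.9 kWh/day comparison with assumed typical figures.
DAILY_AI_KWH = 2.9           # the study's estimate for an active AI user

EBIKE_KWH_PER_MILE = 0.025   # assumption: ~25 Wh per mile for an e-bike
EV_KWH_PER_MILE = 0.29       # assumption: ~290 Wh per mile, average EV
MICROWAVE_KW = 0.8           # assumption: typical microwave power draw

ebike_miles = DAILY_AI_KWH / EBIKE_KWH_PER_MILE   # over 100 miles
ev_miles = DAILY_AI_KWH / EV_KWH_PER_MILE         # around 10 miles
microwave_hours = DAILY_AI_KWH / MICROWAVE_KW     # over 3.5 hours

print(f"E-bike:    {ebike_miles:.0f} miles")
print(f"EV:        {ev_miles:.0f} miles")
print(f"Microwave: {microwave_hours:.1f} hours")
```

With those assumptions, 2.9 kWh works out to roughly 116 e-bike miles, 10 EV miles, or about 3.6 hours of microwave use, in line with the study’s comparisons.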
Such high energy use by AI systems has prompted tech companies to hunt for new energy sources. From Google to Microsoft to Meta, they are all trying to either get into nuclear energy or have tied up with nuclear power plants. But some companies, unable to secure 100 per cent clean energy, are turning to more traditional ways of producing electricity. xAI, which now runs one of the largest clusters of computing power to operate Grok 4, was recently in the news because it started using methane gas generators in Memphis. The move prompted a protest from the local environmental group Memphis Community Against Pollution. “Our local leaders are entrusted with protecting us from corporations violating on our right to clean air, but we are witnessing their failure to do so,” the group noted.
But are a ‘please’ and ‘thank you’ still worth it?
Of course, not everyone agrees on the impact of AI energy use on the environment. Some people think it’s being blown out of proportion.
Kurtis Beavers, a director at Microsoft Copilot, argues that even frivolous messages, politeness included, have benefits. In a Microsoft WorkLab memo, he said that using basic etiquette with AI leads to more respectful and collaborative outputs. In his view, being polite to an AI chatbot improves responsiveness and performance, which might justify the extra energy use.
Elon Musk’s AI chatbot Grok, too, sees the concern as overblown. In its own response to the debate, Grok said that the extra energy used by polite words is negligible in the bigger picture. Even over millions of queries, Grok 4 says, the total energy use would be about the same as running a light bulb for a few hours. In the chatbot’s words, “If you’re worried about AI’s environmental footprint, the bigger culprits are model training (which can use thousands of kWh) and data centre cooling. Your polite words? They’re just a friendly whisper in the digital void.”
– Ends