ChatGPT exploded in popularity right after its public release in November 2022. By early 2023, students were using it to write essays, job-seekers to polish resumes, programmers to debug code, and marketers to draft emails.
While virtual assistants like Siri, Alexa, and Google Assistant had been around for years, they were mostly command-based and limited in depth.
ChatGPT was the first mainstream AI tool to feel genuinely conversational, creative, and context-aware. It was also unsettling, and it didn’t take long for the memes to emerge.
“Making sure to type ‘Thank you’ to ChatGPT so they would spare my life if AI ever takes over the world,” read one meme shared on Reddit, one of countless posts in the same vein.
It turns out, however, that those polite inputs have a real-world cost. According to Sam Altman, all that digital politeness is racking up tens of millions of dollars in server and energy expenses.
The OpenAI CEO broke the news last week, responding to a user on X who asked how much politeness toward AI was costing the company in operating expenses.
“I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models,” the user wrote.
Altman said the expense had reached “tens of millions of dollars well spent.” That is hardly surprising, given that research shows many users tend to interact with chatbots politely.
A February survey found that 67 percent of US AI users are polite to chatbots. Of those, 18 percent admitted they say “please” and “thank you” to protect themselves in case of an AI uprising, while the remaining 82 percent said they’re simply being courteous.
Politeness also comes at an energy cost. A May 2024 report from the Electric Power Research Institute (EPRI) found that asking ChatGPT a question uses roughly 10 times more energy than a standard Google search without AI-generated summaries.
According to analysis by BestBrokers, ChatGPT consumes an estimated 1.059 billion kilowatt-hours of electricity per year, which translates to around $139.7 million in annual energy expenses.
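(Taken at face value, those two figures imply an average electricity price of a little over 13 cents per kilowatt-hour: $139.7 million divided by 1.059 billion kWh works out to roughly $0.132 per kWh.)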
The environmental toll doesn’t stop there. Generating a simple 100-word email with ChatGPT can use up to 1,408 millilitres of water to cool the servers, roughly the equivalent of three standard water bottles, according to a study by the University of California, Riverside.
Even a short response like “You are welcome” requires about 40 to 50 millilitres of water.
Still, in framing the expense as “well spent,” Altman seems to view the cost as worthwhile. “You never know,” he added in the same post on X.
Although he often tempers his warnings with optimism, Altman has acknowledged in many public remarks that AI could pose existential risks to humanity if not carefully managed.