Why Saying “Please” to ChatGPT Is More Expensive Than You Think
Introduction to AI Politeness
Sam Altman, CEO of OpenAI, recently mentioned that all those extra ‘pleases’ and ‘thank yous’ to ChatGPT are quietly racking up tens of millions of dollars in computing costs.
Sounds wild, right? But it's true, and it all comes down to how AI processes language.

The Hidden Cost of Politeness
Being polite to an AI isn't free. Every extra word becomes extra tokens, and every token the model reads or writes consumes compute: each one is processed through the model's billions of parameters. Polite prompts also tend to draw longer, warmer replies, which means still more tokens generated. That takes real energy.
At ChatGPT's scale, those niceties, multiplied across billions of prompts, add up to a massive increase in resource consumption.
For example, a question padded with "could you please" and "thank you so much" carries noticeably more tokens than its terse equivalent. Because inference cost scales with the number of tokens processed, every decorative word translates directly into extra GPU time and electricity.
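A back-of-the-envelope sketch of that math. This is purely illustrative: tokens are approximated by whitespace splitting (real tokenizers count differently), and both the per-token cost and the daily prompt volume are made-up figures, not numbers from OpenAI.

```python
def approx_tokens(prompt: str) -> int:
    """Very rough token estimate: one token per whitespace-separated word."""
    return len(prompt.split())

polite = "Could you please tell me tomorrow's weather? Thank you so much!"
terse = "Tomorrow's weather?"

# How many extra tokens the polite phrasing adds to a single prompt.
extra_tokens = approx_tokens(polite) - approx_tokens(terse)

# Hypothetical figures for illustration only; neither comes from OpenAI.
COST_PER_TOKEN = 0.000002        # dollars per token, assumed
PROMPTS_PER_DAY = 1_000_000_000  # daily prompt volume, assumed

daily_extra_cost = extra_tokens * COST_PER_TOKEN * PROMPTS_PER_DAY
print(f"Extra tokens per prompt: {extra_tokens}")
print(f"Hypothetical extra daily cost: ${daily_extra_cost:,.0f}")
```

Even with tiny per-token costs, a handful of extra words per prompt compounds into a large daily bill once billions of prompts are involved.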
Balancing Performance & Sustainability
Companies like OpenAI are working to make their systems more efficient without sacrificing quality. Techniques such as model pruning and algorithm optimization aim to maintain human-like conversation while reducing environmental impact.
Users can help too: concise prompts and thoughtful interactions lighten the load on servers, supporting both sustainability and accessibility.

Final Thought
Yes, politeness matters. But when it comes to generative AI, every word carries weight: computationally, financially, and environmentally.
The next time you chat with ChatGPT, remember:
A little brevity goes a long way.