Saying “please” and “thank you” to ChatGPT costs OpenAI “tens of millions of dollars”, according to Sam Altman

OpenAI chief executive Sam Altman says the company spends “tens of millions of dollars” every year because users add polite phrases like “please” and “thank you” to their ChatGPT prompts. Altman made the comment on X after a user asked about the electricity cost of those extra words.

Asked on 16 April how much the courtesies cost, Altman replied, “tens of millions of dollars well spent — you never know,” suggesting he considers the expense worthwhile for a more human-sounding service.

The throwaway line quickly ricocheted across tech media. Crypto outlet Cointelegraph and futurist site Futurism both highlighted the tweet, sparking fresh debate over the true cost of large-language-model (LLM) chat.

Hardware publication Tom’s Hardware noted that even a three-word reply such as “You are welcome” consumes compute, water and cooling resources, adding further environmental weight to the seemingly minor tokens.

Each extra word typically adds one or two tokens to a prompt. At ChatGPT’s reported scale of hundreds of millions of requests per day, those courtesy tokens pile up into the trillions per year, multiplying cloud-GPU and power bills.

Industry analysts estimate processing costs for frontier models at roughly US$0.00001–$0.00003 per token. At that rate, a few trillion surplus tokens would indeed reach the “tens of millions” range that Altman mentioned.
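The arithmetic behind that range can be sanity-checked in a few lines. The figures below are illustrative assumptions drawn from the article's rough numbers (request volume, extra tokens per polite prompt, per-token cost), not OpenAI-confirmed data:

```python
# Back-of-the-envelope check of the "tens of millions of dollars" figure.
# All inputs are assumed, illustrative values, not audited OpenAI numbers.

REQUESTS_PER_DAY = 800_000_000       # assumed: "hundreds of millions of requests per day"
EXTRA_TOKENS_PER_REQUEST = 5         # assumed: "please" + "thank you" at 1-2 tokens each, plus spacing
COST_PER_TOKEN = 0.00002             # assumed midpoint of the $0.00001-$0.00003 estimate

extra_tokens_per_year = REQUESTS_PER_DAY * EXTRA_TOKENS_PER_REQUEST * 365
annual_cost = extra_tokens_per_year * COST_PER_TOKEN

print(f"Extra tokens per year: {extra_tokens_per_year:,}")   # trillions of tokens
print(f"Estimated annual cost: ${annual_cost:,.0f}")         # tens of millions of dollars
```

With these assumptions the surplus comes to roughly 1.5 trillion tokens and about $29 million a year; shifting any input within the article's stated ranges keeps the result in the same "tens of millions" ballpark Altman described.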

OpenAI does not itemise the figure in its financial statements, and the company says the tweet was an off-the-cuff estimate rather than an audited cost breakdown.

Still, the exchange underscores how small bits of digital politeness scale into very real data-centre expenses — and why every token counts when running one of the world’s most widely used AI services.
