Exploding costs
We’re talking massive servers, data centers and a dizzying electricity bill. According to analysts, OpenAI was already spending $700,000 per day to run ChatGPT in 2023. Today, a single query to its most advanced models can reportedly cost several hundred dollars. And with around 10 million paying subscribers, that adds up. The result: in 2024, the company recorded revenue of $3.7 billion but forecast losses of nearly $5 billion.
A business model that needs rethinking
Altman has said he set the price of ChatGPT Pro himself, hoping it would be enough to make the plan profitable. Apparently it isn’t. Worse still, even at $200 per month, subscribers make the most of every feature, from o1 pro mode to Sora, the video generation tool. And it’s easy to see why: the plan gives them access to everything, with no limits.
To right the ship, OpenAI is seriously considering raising the prices of its various subscriptions. The Plus plan, currently $20 per month, could be affected as well. Another idea on the table: usage-based pricing. But honestly, who wants to count their requests before asking a question? It would be a risky bet.
What if prices rise?
If prices rise, many users may turn to alternatives like Google Gemini, which already offers similar features at more reasonable prices. ChatGPT still holds some advantages, particularly in logical reasoning, and above all it is already well established in people’s homes.
OpenAI must quickly find a balance between profitability and user satisfaction. Otherwise, even longtime ChatGPT fans might be tempted to click “cancel subscription”.
Have you had a chance to try ChatGPT’s paid plans? Do they meet your needs, and would you be willing to pay even more?