
AI models are getting more expensive: OpenAI raised GPT-5.5 prices by 1.5-2x

Source: 3DNews AI. Collage: Hamidun News.

OpenAI has raised prices for access to its latest GPT-5.5 model through its API. Depending on the use case, costs have increased by 1.5 to 2 times compared to the previous GPT-5 version. This is a clear demonstration that the era of cheap access to cutting-edge AI models is coming to an end.

New pricing structure

OpenAI has introduced differentiated pricing: input tokens have risen in price significantly more than output tokens. Processing input requires much more computing power, since that is where the model ingests context, so the company is encouraging developers to economize on input data. Several optimization tools remain available to developers: the Batch API provides a 50% discount for time-insensitive tasks, and the context caching mechanism lets the system reuse already processed context, saving money on repeated requests.

  • Input tokens: price increase of 50–100% depending on usage
  • Output tokens: cost increased, but by less than input
  • Batch API: 50% discount for deferred processing
  • Context caching: savings on recurring contexts
  • Legacy models (GPT-3.5, GPT-4): prices unchanged
  • Fine-tuning: rates remained the same
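To see how these factors interact, here is a minimal cost-estimator sketch. The per-token prices are illustrative placeholders, not OpenAI's published rates; only the 50% Batch API discount and the idea of discounted cached input come from the article, and the `request_cost` function is a hypothetical helper.

```python
# Hypothetical cost estimator for a single API call under differentiated
# pricing. Prices are illustrative assumptions in dollars per 1M tokens,
# NOT official OpenAI rates.
INPUT_PRICE = 3.00          # uncached input tokens (assumed)
CACHED_INPUT_PRICE = 0.75   # cached-context input tokens (assumed)
OUTPUT_PRICE = 10.00        # output tokens (assumed)
BATCH_DISCOUNT = 0.5        # 50% off for deferred processing (per article)

def request_cost(input_tokens, output_tokens,
                 cached_tokens=0, batched=False):
    """Estimate the dollar cost of one request."""
    fresh_input = input_tokens - cached_tokens
    cost = (fresh_input * INPUT_PRICE
            + cached_tokens * CACHED_INPUT_PRICE
            + output_tokens * OUTPUT_PRICE) / 1_000_000
    if batched:
        cost *= BATCH_DISCOUNT  # Batch API: half price, delayed results
    return cost

# A 10k-token prompt with an 8k-token cached prefix and a 1k-token answer:
interactive = request_cost(10_000, 1_000, cached_tokens=8_000)
deferred = request_cost(10_000, 1_000, cached_tokens=8_000, batched=True)
print(f"interactive: ${interactive:.4f}, batched: ${deferred:.4f}")
```

With these assumed rates, routing the same workload through caching plus the Batch API cuts the per-request cost roughly in half again, which is why large API consumers feel the price increase far less than pay-as-you-go users.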

Reasons for the price increases

The first reason is that infrastructure costs are growing rapidly. GPT-5.5 requires significantly more computational resources, video memory, and electricity than previous versions. OpenAI is constantly expanding its data centers, purchasing new GPUs and servers. Each request is processed on expensive NVIDIA accelerators, whose cost keeps rising along with demand.

The second reason is that demand has outstripped supply. Millions of developers and companies worldwide use the OpenAI API daily to embed AI into their products. This puts an enormous load on infrastructure and requires constant scaling.

The third reason is fierce competition. Google's Gemini, Anthropic's Claude, Meta's Llama, and other players are rapidly expanding their capacity. OpenAI is forced to simultaneously improve model quality, invest in research, and prepare even more powerful versions to maintain its leadership.

"Computing investments are the main expense item in developing and maintaining cutting-edge AI models. We are raising prices to ensure sustainability and service quality," OpenAI representatives explained to the technical press.

How this will divide the market

API price increases will create a natural market division. Large companies (Google, Meta, Microsoft, Amazon) can afford expensive models for critical tasks. They already use the Batch API and context caching, so the price increase won't affect them much.

Startups and small companies with limited budgets will be forced to reconsider their strategy: switch to cheaper alternatives from Anthropic or Google, optimize requests, apply caching, or train local models on their own infrastructure. That is slower, but more economical.

The trend of AI service price increases will continue. As models improve and computing resources become more expensive, prices will rise. The market is gradually filtering out projects that cannot afford the best models.

Hamidun News
AI news without noise. Daily editorial selection from 400+ sources. A product by Zhemal Khamidun, Head of AI at Alpina Digital.