ChatGPT o3's 80% price reduction has no impact on performance

The ChatGPT o3 API is now cheaper for developers, and there is no visible impact on performance.

OpenAI announced on Wednesday that it is reducing the price of its most advanced reasoning model, o3, by 80%.


This means o3’s input price is now just $2 per million tokens, while the output price has dropped to $8 per million tokens.
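As a rough illustration of what the new rates mean in practice, the sketch below estimates the cost of a single o3 API call at the reduced prices. The token counts are hypothetical placeholders, chosen only to show the arithmetic.

```python
# Rough cost estimate at the reduced o3 API prices reported in the article:
# $2 per 1M input tokens and $8 per 1M output tokens.
O3_INPUT_PRICE_PER_M = 2.00   # USD per 1M input tokens
O3_OUTPUT_PRICE_PER_M = 8.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single o3 API call."""
    return (input_tokens / 1_000_000) * O3_INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * O3_OUTPUT_PRICE_PER_M

# Hypothetical request: 10,000 input tokens and 2,000 output tokens
# costs roughly $0.020 + $0.016 = $0.036 at the new prices.
print(f"${estimate_cost(10_000, 2_000):.3f}")
```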

“We optimized our inference stack that serves o3. Same exact model—just cheaper,” OpenAI said in a post on X (https://x.com/OpenAIDevs/status/1932532777565446348).

Although regular users don’t typically access ChatGPT models through the API, the price reduction makes tools that rely on the API, such as Cursor or Windsurf, much cheaper. In a post on X, the independent benchmark community ARC Prize confirmed that the performance of o3-2025-04-16 did not change after the price drop. The group stated:

“We compared the retest results with the original results and observed no difference in performance.”

This confirms that OpenAI didn’t swap out o3 for a different model to reduce the cost; instead, the company optimized the inference stack that powers the model. OpenAI also rolled out o3-pro in the API, a more powerful variant that delivers better results.
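For developers who call o3 directly rather than through a tool like Cursor or Windsurf, a minimal request sketch using the OpenAI Python SDK might look like the following. The prompt is a placeholder, and the pinned model name is the snapshot referenced in the ARC Prize retest.

```python
# Minimal sketch of calling o3 through the OpenAI API with the official
# Python SDK. The prompt is illustrative; "o3-2025-04-16" is the pinned
# snapshot mentioned in the ARC Prize retest.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="o3-2025-04-16",
    messages=[
        {"role": "user", "content": "Summarize the trade-offs of caching."},
    ],
)

print(response.choices[0].message.content)

# Token usage is reported on the response and can be plugged into a cost
# estimate at the new $2 / $8 per-million-token rates.
print(response.usage.prompt_tokens, response.usage.completion_tokens)
```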

