200 Million People Use ChatGPT Daily. OpenAI Can’t Afford It Much Longer

The company lacks the computing power it needs

Alberto Romero

--

GPT-5 will have to wait.

To the surprise of many, OpenAI reported in June that they were not yet training GPT-5.

As of August 26th, it hasn’t been confirmed whether they have started. The good news is that they filed a trademark application for “GPT-5” in July, and developer Siqi Chen has been “told” (presumably by someone inside OpenAI) that the model would finish training by the end of 2023.

But even if OpenAI finishes training GPT-5 this year, they will probably not immediately allow users to access the model. They can’t afford it.

This article is a selection from The Algorithmic Bridge, an educational newsletter whose purpose is to bridge the gap between AI, algorithms, and people. It will help you understand the impact AI has in your life and develop the tools to better navigate the future.

The Algorithmic Bridge is 30% off until September 14th!

The GPU shortage is forcing OpenAI to delay its plans

This has been an open secret in the industry since Q2 2023. Now AI leaders are explicitly admitting this annoying reality: there are not enough cutting-edge Nvidia H100 GPUs to satisfy the demand from cloud providers (Google, Microsoft, and AWS are the big three) and LLM builders (mainly Google, Meta, OpenAI, Anthropic, and Inflection). Supply and demand won’t reach equilibrium until at least Q4 2023.

OpenAI admitted in May that it had delayed its short-term plans for this very reason (the source is a Humanloop article that was taken down at OpenAI’s request). That’s also the real reason they were not training GPT-5 (as a reminder: although GPT-4 was released in March 2023, it finished training in summer 2022, so they’ve already had a year to train GPT-5).

Also in May, Altman told the Senate, half-jokingly, that he wished people used ChatGPT less because OpenAI doesn’t “have enough GPUs,” explicitly pointing to this very issue. The number of daily users has only grown since, to ~200 million, according to Sayash Kapoor, a Ph.D. candidate at Princeton…
