That would be great news for OpenAI if they could somehow prevent people from buying their own computers. Because as inference costs come down for OpenAI, their customers also get access to better and better commodity hardware (and also better models to run on it). And commodity models become more and more capable all the time, even if they’re not the best.
It counters the claim that "inference is too damn expensive". You can argue that cheap inference is actually bad for OpenAI because it makes it easier to run models on commodity hardware, but then you're arguing against your original point that inference is too expensive for OpenAI to be sustainable. Which is it? Is inference too expensive or too cheap?
For now, OpenAI's models are closed source, so if you find their models offer the best value for your use case, you don't have the option of running them on your own hardware. If a competitor releases better products for cheaper, OpenAI will fail, just like any other company would.
It’s both. For an increasing number of tasks, it’s too cheap. People will be able to do simple things on their own hardware. Things OpenAI would love to charge high margins on. There’s probably an 80/20 rule on the horizon.
And for other tasks, it’s too expensive. The frontier is constantly being pushed, so they can’t stop improving or they will fall behind. Google at least makes their own chips, so they can control their costs somewhat.
So you suspect their cheap models have few customers because people prefer to run open source models on their own computers, and their high-end models have either very thin margins over the inference cost or few customers because the costs are too high.
Honestly, it's the training costs that will kill them. AFAIK, training costs have not come down anywhere near as much as inference costs.
And the models don't last long. So you have a rapidly depreciating capital asset that you need in order to provide your services, which is not really a recipe for a sustainable business (certainly not with the fat software margins tech companies are used to).