Comment by TechDebtDevin

15 hours ago

From the little we know about OpenAI's inference infra, I feel I can confidently say that if training stopped today and they were cut off from Azure subsidies, their $20 subscription model would probably still not cover the cost of inference.

I know little about the enterprise side of OpenAI, but I assume they're profitable there. I doubt the subscription fee of a single ChatGPT Plus power user even covers the water their usage consumes (this is probably an exaggeration, but I think I'm in the ballpark).

It may be that extra-large LLMs don't make sense for ChatGPT-style consumer use; they're for enterprise, like supercomputers. That said, I've found real use in running a local instance of Llama, which suggests there is at least some niche of (legal) activities AI can support sustainably. (Versus, e.g., crypto.)
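
For concreteness, this is roughly what I mean by "a local instance of Llama." It's just a sketch using the llama-cpp-python bindings with a quantized GGUF model; the model filename and prompt are placeholders, not a recommendation of any particular setup:

    from llama_cpp import Llama

    # Load a locally stored, quantized Llama model (path is a placeholder).
    llm = Llama(model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=2048)

    # Run a single completion entirely on local hardware -- no hosted inference involved.
    output = llm(
        "Summarize the tradeoffs of self-hosted vs. hosted LLM inference:",
        max_tokens=256,
    )
    print(output["choices"][0]["text"])

The point isn't the specific library, just that a consumer GPU (or even CPU) can serve a small quantized model at a marginal cost I actually pay myself, which is the "sustainable niche" I'm gesturing at.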