While OpenCL simply wasn't equivalent to CUDA, I think you're correct that the other enterprises (Apple, AMD, and similar) that could challenge Nvidia on the high-end GPU front simply chose not to. The thing is, if there were real competition in this market, prices would sink much closer to costs and no one would be making bank, whereas a large enterprise wants a higher return.
Also, a consumer-grade GPU can be used for neural-net training at the researcher level, but large corporate use requires H100s/A100s, and that is what's getting traction.
> prices would sink much closer to costs and no one would be making bank
For Apple and AMD, that's not really a problem. Both of them drive considerable (40%+) margins on their products and can afford to drive things closer to the wire.
I also think more competition here would be good (and I do love lower prices), but Nvidia charges more here because they know they can. It's value-based pricing, and it works because their software APIs aren't vaporware.
> large corporate use requires H100/A100 and that is what's getting traction.
I guess... you really need a strict definition of "requires" for that to hold true. For every non-"competing with ChatGPT" application, you could probably train and deploy with consumer-grade cards. You're technically right here though, and it invites the conversation around what actually constitutes abusive market positioning. Nvidia's actions here really aren't much different than AMD and Intel separating their datacenter and PC product lines. It's a risky move from a "keeping both users happy" standpoint, but hardly anticompetitive.
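To put rough numbers on the "you could probably train with consumer-grade cards" point, here's a back-of-envelope VRAM estimate. The 16-bytes-per-parameter figure assumes fp32 weights, gradients, and two Adam optimizer moments; the model sizes are purely illustrative, and the helper function is mine, not from the thread:

```python
# Rough VRAM needed to train a model with Adam in fp32:
# weights + gradients + two optimizer moments ~= 16 bytes per parameter.
# (Ignores activations, which depend heavily on batch size and architecture.)
def training_vram_gb(n_params: float, bytes_per_param: int = 16) -> float:
    """Estimate training memory in GB for a given parameter count."""
    return n_params * bytes_per_param / 1e9

# A ~350M-parameter research model fits comfortably on a 24 GB consumer card;
# a ~1.5B-parameter model already brushes up against the limit without tricks
# like mixed precision or gradient checkpointing.
print(round(training_vram_gb(350e6), 1))  # ~5.6
print(round(training_vram_gb(1.5e9), 1))  # ~24.0
```

Which is the crude version of why researcher-scale work is fine on an RTX-class card, while frontier-scale runs push you toward 80 GB H100s/A100s and multi-node setups.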
> Both of them drive considerable (40%+) margins on their products and can afford to drive things closer to the wire.
They could do that - but the reason they command these margins is exactly because they don't. Producing products that would genuinely compete with Nvidia would require a significant amount of capital for any company, and those dealing in tens-of-billions-of-dollar chunks expect above-commodity returns on it.
It's not like I like the situation. I wish things were like the '90s, with a lot of competition making sure individual end-consumers got most of the benefits of Moore's law.
But, putting on my economist hat, not all market-structures naturally generate large-scale competition in the fashion of white box PC clones. Some market structures are naturally monopolies (energy), some are naturally oligopolies (automobiles) and some naturally have a dominant player plus marginal players arrayed around them.
There's just no easy solution to this. That said, it's not like we don't have GPUs of unprecedented power available at a variety of price levels.