Comment by m_ke

15 hours ago

I wonder when Meta, Microsoft and OpenAI will partner on an open chip design to compete with NVIDIA.

They’re all blowing billions of dollars on NVIDIA hardware with something like a 70% margin, and with Triton backing PyTorch it shouldn’t be that hard to move off the CUDA stack.
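For a sense of what that means in practice, a Triton kernel is just Python that gets compiled for whatever backend the compiler targets. A minimal sketch (assuming the standard triton / triton.language API; the CUDA device here is only for illustration):

    import torch
    import triton
    import triton.language as tl

    @triton.jit
    def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
        # Each program instance handles one BLOCK_SIZE-wide slice of the tensors.
        pid = tl.program_id(axis=0)
        offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
        mask = offsets < n_elements
        x = tl.load(x_ptr + offsets, mask=mask)
        y = tl.load(y_ptr + offsets, mask=mask)
        tl.store(out_ptr + offsets, x + y, mask=mask)

    x = torch.rand(4096, device="cuda")
    y = torch.rand(4096, device="cuda")
    out = torch.empty_like(x)
    grid = lambda meta: (triton.cdiv(x.numel(), meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, x.numel(), BLOCK_SIZE=1024)

torch.compile already emits kernels in this style rather than hand-written CUDA, so the lock-in is more about the backend compiler and the surrounding ecosystem than about the model code itself.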

It would require a fairly big bet on AI models not changing structure all that much, I guess.

If the tooling and supply chain fall into place, it would surprise me if Meta and friends didn't make their own chips, assuming it was a good fit of course.

Note: AMD's missed opportunity here is so bad that people jump to "make their own chip" rather than "buy AMD". Although watch that space.

What is a large amount of money to you is not that significant to some of these companies. I suspect that for the vast majority of them it still represents a small expense. The general sentiment is that there is probably overspending in the area, but it's better to spend and not risk being left behind.

  • Half of NVIDIA's second-quarter revenue ($30 billion) came from 4 customers, with Microsoft and Meta having already spent $40-60 billion each on GPU data centers (of which most goes to NVIDIA). "Open"AI just raised a few billion and is supposedly planning on building their own training clusters soon.

    For a small fraction of that they could poach a ton of people from NVIDIA and publish a new open chip spec that anyone could manufacture.

    https://www.fool.com/investing/2024/09/12/46-nvidias-30-bill...

    • That again underestimates the challenges in that undertaking. After all, these costs are still drops in the bucket. Why distract yourself from your business to go and build chips?

      They all use SFDC; should they go and create an open source sales platform?


Each of them is designing their own hardware. The goal isn't really to compete with NVIDIA, though, whose market is general-purpose GPU compute. Instead they're customizing hardware for inference to drive down product cost.