Comment by mike_hearn

16 days ago

It says in the article that he was asked by readers to do a Lisa Su interview. The headline is a little misleading: they don't talk much about technological problems. The interview is a soft, lightweight tour of her career with a few attempts to get her to talk about the present-day business. Interviews with someone like Musk or Jensen are much more technically intense.

Honestly this interview feels bearish for AMD. Su's performance is not good. Thompson repeatedly pushes her to reflect on past mistakes, but it's just not happening. The reasons why AMD has fallen so far behind NVIDIA shine through clear as day, and it doesn't look like that's going to get fixed anytime soon.

Su's problem, and therefore AMD's problem, is that she doesn't want to think about software at all. Hardware is all she knows, and she states that openly. Nor does she seem to consider this a weakness. The problem goes back to the very start of her career. The interview opens with Thompson saying she faced a choice between computer science and electrical engineering at MIT, and she picked EE because it was harder. Is that true? AMD is nowhere in AI largely for lack of skilled software people, so now would be a good time to talk up the importance of software, but no, she laughs and says sure! CS seemed easy to her because you "just" write software instead of "building things", whereas in electronics your stuff "has to work". End of answer.

He tries to get a comment on the (in hindsight) not-great design tradeoffs made in the Cell processor, which was hard to program for and so held back the PS3 at critical points in its lifecycle. It was a long time ago, so there's been plenty of time to reflect on it, yet her only thought is "Perhaps one could say, if you look in hindsight, programmability is so important". That's it! In hindsight, the programmability of your CPU is important! Then she immediately returns to hardware, saying how proud she was of the leaps in hardware made over the PlayStation generations.

He asks her whether, if she'd stayed at IBM and taken over there, she would have avoided Gerstner's mistake of ignoring the cloud. Her answer is "I don’t know that I would’ve been on that path. I was a semiconductor person, I am a semiconductor person." Again, she seems to reject on principle the idea that she would think about software, networking or systems architecture, because she defines herself as an electronics person.

Later Thompson tries harder to ram the point home, asking her "Where is the software piece of this? You can’t just be a hardware cowboy ... What is the reticence to software at AMD and how have you worked to change that?", and she just point-blank denies AMD has ever had a problem with software. She then claims everything works out of the box with AMD, and seems to imply that ROCm hardly matters because everyone is just programming against PyTorch anyway!
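
To be fair, there is something to that last claim: from the user's side the backend mostly is invisible. A minimal sketch of "just programming against PyTorch" (assuming a ROCm build of PyTorch, which exposes the HIP backend through the familiar torch.cuda API; nothing in the user code below is AMD-specific):

    import torch

    # Backend-agnostic user code: on a ROCm build of PyTorch the HIP backend
    # is surfaced through the torch.cuda API, so "cuda" here can just as well
    # mean an AMD GPU as an NVIDIA one.
    device = "cuda" if torch.cuda.is_available() else "cpu"

    x = torch.randn(1024, 1024, device=device)
    y = (x @ x.T).relu()   # matmul dispatched to cuBLAS on NVIDIA, rocBLAS/hipBLAS on AMD
    print(device, y.shape)

Which is true as far as it goes; the catch is that someone still has to make the kernels behind that matmul fast and reliable on AMD hardware, and that is exactly the software effort she seems to undervalue.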

The final blow comes when he asks her about ChatGPT, a pivotal moment that catapulted her competitor to absolute dominance and apparently caught AMD unaware. Thompson asks her what her response was. Was she surprised? Did she realize this was an all-hands-on-deck moment? What did NVIDIA do right that you missed? Answer: no, we always knew and have always been good at AI; NVIDIA did nothing different from us.

The whole interview is just astonishing. Put under pressure to reflect on her market position, again and again Su retreats to outright denial and management waffle about "product arcs". It seems to be her go-to safe space. It's certainly possible she just decided to play it all as low-key as possible and not say anything interesting in order to protect the share price, but if I were an analyst looking for signs of a quick turnaround in strategy, there's no sign of one here.

From my point of view AMD is going down not because of Nvidia, but because of ARM and Qualcomm. AMD's Ryzen x64 cash cow is going to start declining soon in both the server and consumer space.

I saw this clear as day when the M1 MacBooks came out and AWS Graviton servers started becoming more popular and cheaper. It was inevitable that the PC world would move to ARM soon; in fact, I am surprised that it took this long to get viable ARM PC laptops (only this year).

So unless AMD has some secret ARM or RISC-V research division close to launching a product, I don't see how it is going to survive long term.

  • > So unless AMD has some secret ARM or RISC-V research division close to launching a product, I don't see how it is going to survive long term.

    AMD has already built ARM chips and uses a modified ARM core for their Platform Security Processor. They have an architectural license and have already committed to launching ARM chips by 2025.

    Why would they need to make it a secret? And what makes you think that they are?

  • An ARM frontend was already a thing at AMD a decade ago.

    Btw, as of now ARM has a bigger market share in people's minds than in actual sales, and it ain't going to eat x86 even by 2030.

  • The GPU scheduler is an ARM chip. As in taped out, shipping in volume, and has been for years. I think there was a project a decade or so ago that put an ARM front end on an x86 core. If the industry decides it likes the AArch64 ISA more than x64, I doubt it would cause much trouble for the silicon team.

    • > If the industry decides it likes the AArch64 ISA more than x64, I doubt it would cause much trouble for the silicon team.

      The problem isn't the ISA so much as AMD no longer having a moat around its CPU revenue. Yes, AMD will have ARM chips... and so will Qualcomm, and MediaTek/NVIDIA, and Marvell. AWS and Facebook and Google have already largely dumped x86 chips. But they haven't dumped them for ARM chips from AMD, or Intel, or Qualcomm; they've dumped them for ones they develop themselves, at costs that would be commercially unviable for AMD to match as a semicustom product (and those companies would not want that in the first place).

      It's not about Intel vs AMD and which of the two is faster anymore. It's about the fact that Intel plus AMD are going to shrink to a minority share of the market as ARM commoditizes a market from which Intel plus AMD extracted economic rents for decades. That free money train is coming to a stop. Absolutely there is going to be a market for AMD as a consultancy business helping others integrate chips (just like semicustom today), but even that is going to be something that many others do, and the IP cores themselves certainly don't have to be licensed from AMD anymore; there will be many viable offerings, and AMD's margins will be thinner when it provides less of the stack and the design services. Absolutely there will be people willing to pay for the cores too, but they will be far fewer than the people who just want something that's Good Enough to hook their custom accelerator up to a CPU (just like AWS Graviton and Google TPU).

      And while consoles are still going to have legacy support as a major inertia factor, even that is not forever: the FTC leak showed Microsoft scoping the next-gen console as an "ML super resolution upscaling" and RTGI-focused ARM thing, right? Especially with how far AMD has fallen behind in software and accelerator integration, ARM will eventually break that moat too - especially with NVIDIA pushing hard on it with the Switch 2 as well. Even these ancillary services that AMD provides are largely important because of the moat that x86 puts around anyone else having easy interoperation.

      Margins are going to come down; the overall market will grow but AMD will command less of it, and large customers are going to leave that market entirely and do it themselves (AMD will still get a chunk of that, but it's a much smaller pie than selling the whole CPU).

      Again, the problem isn't that AMD doesn't have a Snapdragon X Elite. It isn't even about whether they're faster than the Snapdragon X Elite. It's that the Snapdragon X Elite is going to do to client revenue what Graviton and the TPU are already doing to your datacenter revenue. It's what ARM and Qualcomm and NVIDIA are going to do to your x86 and graphics SIP licensing revenue, and your semicustom integration revenue. Even if they are flatly not as good, they don't have to be in order to collapse your x86 rents and your margin on the resulting design and integration services and products. We will come to realize we didn't need to pay as much for these things as we did, and that integration isn't as expensive when you have five similar blocks from five different companies and three places that can integrate the SoC. That inefficiency is all part of the x86 rents / x86 tax too.

      This isn't to say you're doooomed, but Intel and AMD are not winners from the power play that Microsoft is making against x86 right now. You are going to be facing 3-4 more well-funded, large competitors who can offer services substitutable for yours in a way that nobody appreciated would happen a year ago. Windows on ARM being a viable proposition in terms of support and emulation changes the game on the x86 rents in the client/gaming markets, and companies are moving in to take advantage of that. Intel and AMD have nowhere to go but down on that one - they will not control the resulting market anymore; in fact they are already losing control of large parts of the datacenter market to cloud ARM offerings from hyperscale partners. Now it's client too, with foreseeable risk to consoles and semicustom.

      All of that ancillary revenue was, in the final measure, gated by x86. Now that x86 is no longer a necessity, we're going to see a significant increase in market efficiency, and parties that depend on the largesse of those rents are going to have problems. You can't rely on graphics being a loss leader (or low-margin leader) to sell x86 SoCs, for example.


  • Apparently someone at AMD (Su herself?) mentioned that making an ARM frontend for Ryzen isn't that difficult. Perhaps they already have prototypes lying around in their labs?

    • Even if that is the case and they can just port their designs to ARM, they will still be facing much more competition, not only from Qualcomm but also from the cloud vendors in the server space (AWS, Azure, GCP).

      All the cloud server hardware is getting more and more vertically integrated; why would the cloud vendors pay for AMD hardware when they can build their own?

  • Did you read my post from 4 years ago?

      Is AMD the king of the Titanic (x86)?
    

    https://www.reddit.com/r/AMD_Stock/comments/kg4e8j/is_amd_th...

    I basically outlined why I would not invest in AMD and why it was inevitable that ARM would take over servers and personal computers.

    • No, I did not read it; we just arrived at the same conclusions, although you realised it a bit earlier than I did. What opened my eyes was the ease of the transition to the ARM-based Macs. I fully agree with all your points, and that has been my view since around 2021 (when I got an M1 Mac).

      Once dev computers are largely running ARM, no one is going to bother cross-compiling their server code to x64; they will just compile for ARM, which will tear through AMD's server demand. In fact, my own org has already started migrating to AWS Graviton servers.

      And this bodes poorly for Nvidia as well; I bet all the cloud providers are scrambling to design their own in-house alternatives to Nvidia hardware, and maybe alternatives to CUDA too, either to remove the Nvidia lock-in or to create their own lock-ins. Although Nvidia is much better positioned to stay ahead in that space.


Electrical engineers do generally think software is easy, even when their day is a horror show of TCL and Verilog. In fairness, I think hardware is horrendously difficult, so maybe they're not wrong.

  • *chuckle* The fact that they can solve their problems with TCL and Verilog should be all the proof it takes to conclude that their jobs are easier.

    But the guys who really have it hard are the ones writing the EDA software. You have to be an expert in both chip design and software development. You are writing software to design tomorrow's computers, and it has to run on today's computers. Virtually every problem is NP-complete, and the problem size is growing with Moore's law.

    And all your customers just LOOOOVE TCL :-) so you end up writing a lot of that too :-)

  • Most bachelor's- and master's-level CS is easier than EE because EE requires much more hard math. The theory at least; project-wise, CS is more demanding. I had two EE roommates in college; their exams were HARD, but their home projects were easy compared to CS (fewer projects overall as well).

    I remember one exam my roommate complained about: it was mostly about getting all the formulas he needed into his scientific calculator before the exam even started. If you understood how to derive the formulas, and knew how to put them into the calculator and use them, you passed. I think it was an analog circuit processing exam, but I might be wrong.

    Research-level computer science can get very hard as well, though. A lot of it is more pure mathematics than engineering.

    • As far as undergraduate work goes, EE is indeed harder due to the math background required. However, if you take the same brilliant minds who would ace EE and reallocate them to software, they won't magically throttle down and limit themselves to undergrad CS concepts; they will find and tackle all the complexity they can withstand. You end up with the JS ecosystem, CI/CD, IaC, columnar databases, and so on. So I wonder if some of that is happening here: AMD thinking that undergrad-CS-level effort is all there is, while missing the invisible complexity that NVIDIA managed to tackle.


    • > I had two EE roommates in college; their exams were HARD, but their home projects were easy compared to CS (fewer projects overall as well).

      Maybe that's just a result of EE take-home projects being less practical? Hold on, let me walk on over to my wire bonding station ...

      In my applied EM class in college, we had a year-end project in which we built an antenna of a specified type (e.g., helical, corner reflector, etc ... ). The final exam was essentially a do-or-die transmitter hunt. We had a lot of open lab time to do it. But that project was an exception, not the norm.


  • They are quite different skills, I think. Being good at one doesn't tend to mean being good at the other (and this also applies to organisations: hardware companies will tend to suck at software by default and vice versa, not because the skills are necessarily anti-correlated but because they're sufficiently uncorrelated that a company will tend to be average, i.e. bad, in the area it isn't focused on). But then there's a pretty wide spread of difficulty and skills within the individual parts of either field, which dwarfs any difference in the averages.