Comment by refurb

12 days ago

I’m not anti-CEO, I think they play an important role, but why would you interview a CEO about hard technological problems?

She worked as a researcher in the field for decades. Moore's law only kept up because of some of the techniques she developed.

> During her time at IBM,[6] Su played a "critical role"[7] in developing the "recipe"[2] to make copper connections work with semiconductor chips instead of aluminum, "solving the problem of preventing copper impurities from contaminating the devices during production".[7] Working with various IBM design teams on the details of the device, Su explained, "my specialty was not in copper, but I migrated to where the problems were".[6] The copper technology was launched in 1998,[7] resulting in new industry standards[22] and chips that were up to 20% faster than the conventional versions.[6][7]

AMD was close to ruin when Lisa took over. It had completely lost the graphics and x64 wars and was limping by on low-margin games consoles.

Since then Epyc has broken Intel. Anyone buying Xeons today is at credible risk of being fired over it when someone notices the power bill relative to their competition.

The graphics ramp is behind Nvidia in mindshare and market share. The ROCm software stack gets a lot of grief on here. Nevertheless, Nvidia lost the Frontier and El Capitan bids, and now Microsoft is running GPT-4 on AMD hardware. Sounds pretty good to me given it's one product line of many.

If turning a multinational semiconductor firm away from the edge of bankruptcy and into the profitable conglomerate it is today doesn't qualify as "solving a hard problem" I don't know what would. It's way beyond what I'd be capable of doing.

  • Major clouds have been investing in ARM alternatives. x86 is still king; compatibility matters a lot. But it is not as simple as Su paints for teams of x86 chip designers to switch to designing ARM chips and reach the top spot, specifically because fabs also matter, and AMD is the player in the market with the least money to pay the fabs.

    The GPU market would be hard to recover, and the reason to use AMD (for the money-printing AI workloads) is just budget. The software is not good enough; understandably, it's not in AMD's DNA, as the company was simply lacking the budget, being close to bankruptcy when CUDA started taking off.

    Emphasis on "top", ofc; great designers would still design great chips no matter the ISA, but "best" and "great" are different.

Lisa Su is one of the (surprisingly rare) tech CEOs who comes from an engineering background.

  • You mean like the CEOs of Intel, Nvidia, TSMC, ASML, Micron etc?

    • I've noticed that it's all the chip companies that have CEOs who understand, at a deep level, the stuff the company is actually doing.

      Compare and contrast the big software / services tech firms...

      It feels like companies like Intel, AMD, Nvidia and TSMC are much more about delivering good technical solutions to hard problems than software companies like Google or Microsoft that try to deliver "concepts" or "ideas".

      I do sometimes wonder though why decision making at a company like AMD benefits so much from having a highly competent engineer as the CEO compared to let's say Oracle...


    • In general it's very common for CEOs to be of a business/MBA background. Apple and Google both have this, along with tons of other companies outside the semiconductor industry.


Many of the hard problems these days require scale, and with it coordination.

It says in the article that he was asked by readers to do a Lisa Su interview. The headline is a little misleading; they don't talk much about technological problems. The interview is a soft, light tour of her career and a few attempts to get her to talk about the present-day business. Interviews with someone like Musk or Jensen are much more technically intense.

Honestly this interview feels bearish for AMD. Su's performance is not good. Thompson repeatedly pushes her to reflect on past mistakes, but it's just not happening. The reason AMD has fallen so far behind NVIDIA shines through clear as day, and it doesn't look like it's going to get fixed anytime soon.

Su's problem, and therefore AMD's problem, is that she doesn't want to think about software at all. Hardware is all she knows, and she states that openly. Nor does she seem to consider this a weakness. The problem goes back to the very start of her career. The interview opens with Thompson saying she faced a choice between computer science and electrical engineering at MIT, and she picked EE because it was harder. Is that true? She's nowhere in AI due to a lack of sufficiently skilled devs, so now would be a good time to talk up the importance of software, but no, she laughs and says sure! CS seemed easy to her because you "just" write software instead of "building things", whereas in electronics your stuff "has to work". End of answer.

He tries to get a comment on the (in hindsight) poor design tradeoffs made in the Cell processor, which was hard to program for and so held back the PS3 at critical points in its lifecycle. It was a long time ago, so there's been plenty of time to reflect on it, yet her only thought is "Perhaps one could say, if you look in hindsight, programmability is so important". That's it! In hindsight, the programmability of your CPU is important! Then she immediately returns to hardware, saying how proud she was of the leaps in hardware made over the PS generations.

He asks her if she'd stayed at IBM and taken over there, would she have avoided Gerstner's mistake of ignoring the cloud? Her answer is "I don’t know that I would’ve been on that path. I was a semiconductor person, I am a semiconductor person." - again, she seems to just reject on principle the idea that she would think about software, networking or systems architecture because she defines herself as an electronics person.

Later Thompson tries harder to ram the point home, asking her "Where is the software piece of this? You can’t just be a hardware cowboy ... What is the reticence to software at AMD and how have you worked to change that?" and she just point-blank denies AMD has ever had a problem with software. Later she claims everything works out of the box with AMD and seems to imply that ROCm hardly matters because everyone is just programming against PyTorch anyway!

The final blow comes when he asks her about ChatGPT: a pivotal moment that catapulted her competitor to absolute dominance, apparently catching AMD unawares. Thompson asks her what her response was. Was she surprised? Did she realize this was an all-hands-on-deck moment? What did NVIDIA do right that you missed? Answer: no, we always knew and have always been good at AI. NVIDIA did nothing different from us.

The whole interview is just astonishing. Put under pressure to reflect on her market position, again and again Su retreats to outright denial and management waffle about "product arcs". It seems to be her go-to safe space. It's certainly possible she just decided to play it all as low-key as possible and not say anything interesting to protect the share price, but if I were an analyst looking for signs of a quick turnaround in strategy, there's no sign of one here.

  • In my view AMD is going down not because of Nvidia, but because of ARM and Qualcomm. AMD's Ryzen x64 cash cow is going to start declining soon, both in the server and consumer space.

    I saw this clear as day when the M1 MacBooks came out and AWS Graviton servers became more popular and cheaper. It was inevitable that the PC world was going to move to ARM soon; in fact, I'm surprised it took this long to get viable ARM PC laptops (only this year).

    So unless AMD has some secret ARM or RISC-V research division close to launching a product, I don't see how it is going to survive long term.

    • > So unless AMD has some secret ARM or RISC-V research division close to launch a product I don't see how it is going to survive long term.

      AMD has already built ARM chips and uses a modified ARM core for their Platform Security Processor. They have an architectural license and have already committed to launching ARM chips by 2025.

      Why would they need to make it a secret? And what makes you think that they are?

    • An ARM frontend was already a thing at AMD a decade ago.

      Btw, as of now ARM has a bigger market share in people's minds than in actual sales, and it isn't going to eat x86 even by 2030.

    • The GPU scheduler is an ARM chip. As in taped out, shipping in volume, and has been for years. I think there was a project decades ago that put an ARM front end on an x86 core. If the industry decides it likes the AArch64 ISA more than x64, I doubt it would cause much trouble for the silicon team.


    • Apparently someone at AMD (Su herself?) mentioned that making an ARM frontend for Ryzen wouldn't be that hard. Perhaps they already have prototypes lying around their labs?


  • Electrical engineers do generally think software is easy, even when their day is a horror show of Tcl and Verilog. In fairness, I think hardware is horrendously difficult, so maybe they're not wrong.

    • *chuckle* The fact that they can solve their problems with Tcl and Verilog should be all the proof it takes to conclude that their jobs are easier.

      But the guys who really have it hard are the ones writing the EDA software. You have to be an expert in both chip design and software development. You are writing software to design tomorrow's computers, and it has to run on today's computers. Virtually every problem is NP-complete, and the problem size is growing with Moore's law.

      And all your customers just LOOOOVE Tcl :-) so you end up writing a lot of that too :-)

    • Most bachelor's- and master's-level CS is comparatively easier than EE, because EE requires much more hard math. The theory, at least; project-wise, CS is more demanding. I had two EE roommates in college. Their exams were HARD, but their home projects were easy compared to CS (fewer projects overall as well).

      I remember one exam my roommate complained about, which came down to getting all the formulas he needed into his scientific calculator before the exam even started. If you understood how to derive all the formulas, knew how to put them in the calculator, and knew how to use them, you passed the exam. I think it was an analog circuit processing exam, but I might be wrong.

      Research-level computer science can get very hard as well, though. A lot of it is more pure mathematics than engineering.

      4 replies →

    • They are quite different skills, I think. Being good at one doesn't tend to mean being good at the other. (This also applies to organisations: hardware companies will tend to suck at software by default, and vice versa. Not because the skills are necessarily anti-correlated, but because they're sufficiently uncorrelated that such companies will tend to be average, i.e. bad, in the area they aren't focused on.) But then there's a pretty wide spread of difficulty and skill within the individual parts of either field, which dwarfs any difference in the averages.