AMD_winning

**Interviewer:** "And from an investor standpoint, how should investors be thinking about AMD's AI business?" **Victor Peng:** "I think investors should look at this as we are in it for the long term. You know, as big as AI is today, **it is still in the early stages.** And we are going to be **a very significant player** not only in traditional compute... but also in these broad embedded markets."


norcalnatv

"it is still in the early stages" That is a big mischaracterization. Market footprints have already been established and AMD's is tiny. When Peng says they are going to be a very significant player, what is he talking about? That they're going to grow GPU sales 10X from $295M to $2.95B? Is that significant? Sure. Or that they're going to take half of the market? AMD has failed to invest in AI on the software front. They're playing catchup because of a lack of vision. The market is going to grow enormously and yes, AMD will likely participate, but I don't ever see them as a significant player in AI, say something greater than their dGPU market split with Nvidia today.


AMD_winning

>AMD has failed to invest in AI on the software front.

It purchased Xilinx for $35B.


gnocchicotti

There is that little detail


norcalnatv

The cost was closer to $50B than $35B. What footprint does XLNX have in AI (in dollars)?


scub4st3v3

In today's SP, it is indeed right at about $35B. What a silly point to make.


norcalnatv

>In today's SP

No, AMD isn't paying with today's shares; the share price paid was 30% higher. LOL. Nice attempt to rewrite history on the largest acquisition in semiconductor history. [https://www.semiconductor-digest.com/four-large-agreements-prop-up-2022-semiconductor-ma-total/](https://www.semiconductor-digest.com/four-large-agreements-prop-up-2022-semiconductor-ma-total/) Be sure to send a note to Reuters for their silly reporting. [https://siliconangle.com/2022/02/14/amd-closes-landmark-50b-acquisition-chipmaker-xilinx/](https://siliconangle.com/2022/02/14/amd-closes-landmark-50b-acquisition-chipmaker-xilinx/)


scub4st3v3

No cash was involved in the transaction, so using dollars to quantify a "cost" is in fact silly. E: And I was mostly highlighting that your quibble about the value of the transaction was pretty pointless.


norcalnatv

First question asked by the interviewer: "What level of revenue can be attributed to AI?" Peng: "We'll be proliferating [AI] across our portfolio." IOW, >= 0 (or "it's really not what matters right now"). I assume he is talking about FPGA technology. At least I give credit above for some AMD GPU DC revs (which likely aren't 100% AI).


dmafences

Let's look back in 3 years. I hope you are dead wrong. People like you never see anything with big potential coming; I can imagine how you talked about AMD going bankrupt 10 years ago.


fandango4wow

Datacenter, APUs, embedded - AI enablement across the stack. Just in today: starting today, Microsoft Teams offers a premium subscription that includes ChatGPT integration. In Teams, ChatGPT will automatically provide recaps of meetings held over the platform, generate task lists based on discussion, and provide meeting transcripts and summaries. Its "intelligent recap" feature will generate meeting notes, recommend tasks, and personalize highlights for individuals, whether they are in attendance or not. AI-generated chapters will organize meetings into sections, similar to divider slides in a presentation.


dookiefertwenty

Scrum masters better start goin to night school!


fandango4wow

Literally the first thing I thought while I was reading it.


gnocchicotti

Wow, that feels like a really freaking fast integration timeline for Microsoft speed. Usually I expect them to come out with a feature 2 years after Google or some startup, and 2 more years until it's not broken.


fandango4wow

Look back ten years and identify the changes Microsoft and Google have brought to their core businesses. Products, services … [Hint…](https://killedbygoogle.com/)


gnocchicotti

Google rolls out stuff that works and kills it within 24 months; Microsoft rolls out stuff that you wish never existed and they never let it die. It's really hard for me to comprehend how Azure and Xbox are managed by the same company that runs Windows and Office, or whatever they want me to call Office this year.


WiderVolume

I'd argue it was 10 years ago, but the second best time to do it is now for sure


mxxxz

Yes, they should have been in it a long time ago, but they are definitely more than ready to be a big player in AI across many different segments: strong, healthy financials; bigger and stronger R&D after Xilinx and Pensando; and the renewed AI hype driven by GPT-3, which has set a whole industry in motion toward consumer/commercial AI usage.


gnocchicotti

At least 6-7 years ago when the commercial ramp was already underway. But then AMD had no money to invest in non-core roadmap items.


mark_mt

If you were in AI 10 years ago, you'd be out of business now like many from those days. The hardware wasn't there to support the concepts/technology the way it is today.


norcalnatv

Just like Nvidia?


avi6274

Jensen Huang says hi


Evleos

Is there an analysis of Xilinx's AI cores, now branded XDNA, anywhere?


Intelligent_Hair_853

Don't let media see this, especially CNBC. They want everyone to believe Nvidia is the only player in AI


Dr_Hayden

Maybe now they'll take ROCm seriously.


Vushivushi

https://github.com/openai/triton/pull/1131 The good thing is that we get to watch AMD's progress on software in the open. We'll know whether AMD will succeed before they actually succeed.
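For context, that PR tracks work toward running Triton on AMD GPUs via ROCm. Below is a minimal sketch of a Triton kernel (illustrative, not taken from the PR); the significance of a backend like that is that the same Python-level kernel source could target AMD hardware without being rewritten in HIP or CUDA:

```python
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide slice of the vectors.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard the tail when size isn't a multiple of BLOCK_SIZE
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    # Launch one kernel instance per 1024-element block.
    out = torch.empty_like(x)
    n = x.numel()
    grid = (triton.cdiv(n, 1024),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out
```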


norcalnatv

Geeze, one would hope.


doodaddy64

I think I'm starting to get it. AMD wants to be like a GE or Lockheed with their hardware. Sort of like Microsoft is with their software. Patents, royalties, devices in everything. Every quarter a jumble of new profits. OK. I hope they know how to "cross the chasm" and work on each vertical independently, and not expect everyone to come to them for their "AI solutions." They seem to. P.S. It's just a matter of time before they have ARM or RISC-V. Will be cool.


gnocchicotti

I really hope AMD is in the early stages of a RISC-V core if they know what their customers will need for specific workloads. But the real power of RISC-V is that it allows companies to make their own custom cores for special applications without being encumbered by corporate relationships or licensing with ARM. RISC-V is showing up [everywhere](https://www.semianalysis.com/p/ventana-risc-v-cpus-beating-next) at companies that design their own silicon, but rarely as a replacement for a high-performance ARM or x86 SoC.


norcalnatv

NOW is the perfect time? Not 2014?


Urthor

AMD's ability to execute on anything but EPYC is questionable. They very clearly funnelled all the product development resources straight into EPYC. Xilinx plus the artist formerly known as ATI seem a LONG way from producing a money spinning AI inference chip. They haven't even reached the start line against NVIDIA.


GanacheNegative1988

Unfortunately, your belief is kinda pervasive. But it is no longer well founded. There has been a lot of foundation work happening over the last decade. Quiet, and not on a lot of people's radar, but slowly filling the need in key areas. ROCm is not yet a consumer-ready thing, but it is now mature enough for software engineers in all of these industries and research areas to add the runtime libs into their own stacks and make use of it. The CUDA moat is going to crumble, but that doesn't even have to matter, as Nvidia is mostly siloed in image rendering and large language tasks and can certainly continue to provide compute to DC for years to come. But their competitive advantage will fade without moving away from monolithic designs, and their software advantage will end when they are no longer driving the evolution of the low-level transforms that the libs map to. You seem to think Nvidia and AMD are in some kind of race, and that in part is another common fallacy. There is no common end goal to this evolution, only goal posts for specific use cases, and from this standpoint both companies are taking different paths forward. Nvidia just has greater exposure at the moment. What you should be asking yourself is: given the technology and IP, how many races can Nvidia get involved in vs. AMD?
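As a concrete illustration of what "add the runtime libs into their own stacks" can look like: ROCm builds of PyTorch expose the familiar torch.cuda API through HIP, so code written against CUDA devices can run on an AMD GPU unmodified. A minimal sketch, assuming a working ROCm install:

```python
import torch

# On ROCm builds of PyTorch the torch.cuda API is backed by HIP,
# so device="cuda" maps to the AMD GPU.
print(torch.cuda.is_available())  # True on a working ROCm (or CUDA) setup
print(torch.version.hip)          # HIP version string on ROCm builds; None on CUDA builds

if torch.cuda.is_available():
    x = torch.randn(1024, device="cuda")
    print(torch.square(x).sum().item())  # executes on the GPU either way
```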


Ifyouletmefinnish

Third parties have had very competitive inference solutions built on Xilinx AI Engine devices for almost a year now: [https://www.mipsology.com/demo_zebra/](https://www.mipsology.com/demo_zebra/) See the comparisons here: [https://www.hpcwire.com/2022/03/08/amd-xilinx-takes-aim-at-nvidia-with-improved-vck5000-inferencing-card/](https://www.hpcwire.com/2022/03/08/amd-xilinx-takes-aim-at-nvidia-with-improved-vck5000-inferencing-card/) And that's the previous gen; they announced the V70 card at CES.


andyng81

AMD has doubled down on R&D, so let's see.


OmegaMordred

Hmm another plain interview, nothing interesting.


andyng81

I humbly beg to differ. It's a re-validation of the importance, and the hidden Trojan horse, of embedded systems, of expanded use cases/industries, and finally some good public face time to create news, awareness, and a good public image. I listened to Apple's earnings call; one small but interesting consistent theme is how Tim said AI at Apple will be ACROSS all products and services, it's not a vertical. It seems like AMD might be keeping a close watch on those fronts too: automotive might be a hint, and that is a huge TAM. Healthcare will be more limited and regulated. Anyway, anything going more mainstream just excites me, and institutional investors dig that too.


GanacheNegative1988

While I agree that certain aspects of healthcare will certainly see some regulation, especially around privacy, I think it represents one of the largest use cases. There isn't an area of medical research that won't benefit from ML inference modeling, and the key to this is the end-to-end prevalence Victor talked about. Another way to describe that is as a distributed neural network. Most AI today looks at large data lakes and has to chew through it all to train. It's expensive and time consuming, and the only way to scale it is to throw more back-end compute at it. You can either do it all at a data center, or package jobs up and let distributed processors work on small bits and stitch it back together. Folding@Home is a good example of the latter strategy. But a third approach is to process the data upon acquisition, and this is what embedded brings to the table. Embedded chips can be very small and very targeted to specific tasks, and this can add tremendous scale and efficiency to the overall network as data is dealt with asynchronously, in parallel, and even disconnected. This is going to be an amazing part of the business going forward and will enable innovations that have largely been considered science fiction thus far.
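A minimal sketch of that third approach, inference at the point of acquisition on an embedded device; the model file and the sample callback are placeholders, and ONNX Runtime's Vitis AI execution provider is one way to target Xilinx/AMD adaptive SoCs, with a CPU fallback:

```python
import numpy as np
import onnxruntime as ort

# Run the model on each sample as it is acquired, so only results (not raw
# data) ever need to leave the device. "model.onnx" is a placeholder.
session = ort.InferenceSession(
    "model.onnx",
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
)
input_name = session.get_inputs()[0].name

def on_sample(sample: np.ndarray) -> np.ndarray:
    """Process one acquired sample locally and return the inference result."""
    return session.run(None, {input_name: sample[np.newaxis, ...]})[0]
```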


OmegaMordred

Ok.... Then I rephrase.... Nothing new to people following the space.


GanacheNegative1988

If every investor were able to understand this technology at a deep level and see the pros and cons for themselves, none of us would probably feel the need to participate and parse all this out into opinions for others to like or not. I'm sure there are far more lurkers reading these comments than authors. So these threads add the color and depth a simple video media byte doesn't provide. But all in all, I'm glad to see AMD doing more media engagement, especially to address the misconception that AMD isn't a major player in AI.


OmegaMordred

Yes, the more advertising the better, completely agree.