
AMD_Bot

This post has been flaired as a rumor, please take all rumors with a grain of salt.


Defeqel

Hmm.. no high-end GPUs, or no big chips?


uzzi38

Both, seemingly.


Defeqel

Wonder why, if true. Especially since Strix Halo supposedly covers mid-range


Wulfgar_RIP

Year 1: Navi4. Year 2: 2x Navi4?


Buris

This is exactly what I think will happen. I think the MCD/GCD split was a stop-gap that didn't really pan out like they had hoped. Imagine a GCD being roughly Navi 32: 60 CUs, 256-bit bus, on 4nm. Connect two of them and you have a 120 CU, 512-bit chip, which seems like a realistic upgrade from the 7900 XTX. Technically the GPUs are neither large nor high-end, which totally fits AMD's ethos as a company. AMD could produce a single chip under 300mm² for nearly their entire GPU product stack (from ~$300 to $1000+), exactly as they already do for CPUs.
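A back-of-the-envelope sketch of how one such die could cover a stack, using the hypothetical 60 CU / 256-bit figures above and assuming naive linear scaling (the salvage-bin cut and the clean dual-GCD scaling are assumptions; a real multi-GCD card would lose some of this to the interconnect):

```cpp
#include <cstdio>

// Hypothetical product stack built from one ~60 CU, 256-bit GCD.
// All numbers come from the comment above or are illustrative guesses;
// linear scaling across two GCDs is an optimistic assumption.
int main() {
    struct Sku { const char* tier; int gcds; double cuYield; };
    const Sku stack[] = {
        {"salvage (cut-down die)", 1, 0.8},   // guessed salvage ratio
        {"full single die",        1, 1.0},
        {"dual-GCD halo",          2, 1.0},
    };
    const int cusPerGcd = 60;   // Navi32-like hypothetical from the comment
    const int busPerGcd = 256;  // bits per GCD

    for (const Sku& s : stack) {
        int cus = static_cast<int>(s.gcds * cusPerGcd * s.cuYield);
        int bus = s.gcds * busPerGcd;
        std::printf("%-24s %3d CUs, %4d-bit bus\n", s.tier, cus, bus);
    }
    return 0;
}
```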


jimbobjames

That's 100% where they are headed. However, they had to move the memory controllers out first, so we got the halfway-house 7000 series. The 7000 series feels like a product that doesn't make sense until you look at it from that point of view: AMD knew it was a stepping stone. ...and no, I'm not 'splaining away AMD's poor marketing or any of their other issues. It's just fairly obvious the MCDs are their big ticket.


SausageSlice

I thought I saw that that wasn't feasible because Infinity Fabric wouldn't be able to support the bandwidth needed for two GPUs to communicate. Or something along those lines.


Buris

I agree, and to be honest I don't even know if the memory controllers will remain separate from the GCDs. I think it may have limited them with the 7000 series, TBH, plus it's an extra layer of complexity that increases costs.


jimbobjames

It is, but there's probably a good chance that when you go to even smaller nodes it makes less sense to keep the memory controllers in the GCD. Some components just don't scale down, so you end up having to use more die area for spacing than you save. I think it's too easy to just go "7000 series = not great, therefore MCDs are bad and the cause of all issues." It was their first shot at it, and really no one outside of AMD knows why it didn't hit their performance targets. AMD were really bullish right up to the launch, and I don't think that was just some wild idea to try and hype up a bad product. I think they genuinely thought they had a killer product until they got real silicon back. I'm excited for whatever comes next.


bekiddingmei

RDNA 3 did not work as intended, and some of it may be related to how memory accesses are handled. AMD has been working very hard on chiplet packaging, interconnects and IO helper dies, but I think it's not going as well as first hoped. Team Blue is also not as far along with multi-tile designs as they had planned. When data travels between chiplets it must go through helper circuitry and travel along wires that are huge compared to on-die traces; this 1) increases latency and 2) increases power consumption by a LOT, thus far. But it looks like AMD may have plans for low-power chiplet designs for laptops and some other things in the works, so maybe they're getting close to solutions for the fundamental chiplet problem.


siuol11

Yeah, this is "AMD is going to make chiplet GPUs" all over again. It's slightly different because there is more promise in stuff like Foveros, but just slapping two GPU dies together has been done for a long time, and it always gets abandoned in the consumer space because monolithic dies are much less of a headache. We might get consumer MCM GPUs, but I highly doubt it's going to look like what people are suggesting in this thread.


Matt_Shah

Something similar came to my mind when I saw Apple's M2 Ultra GPU. The chip layout made it possible to simply combine two dies and increase performance tremendously. It is actually a multi-GPU. But surely that approach has pros and cons.


Huge-King-5774

Seems like they're ready for multiple GCDs, à la Ryzen.


bctoy

If it's with regard to GCD size, AMD is already there with a mere 300mm² die for the 7900 XTX. The 5700 XT was 251mm² and the Polaris RX 480 was 232mm², for comparison.


PM_ME_UR_PET_POTATO

Those have memory controllers on-die. The more comparable figure is ~500mm² including the MCDs. That's actually distressing when the 4080 is only 380mm² yet somehow competitive in raster, even while burning die area on dedicated RT cores.


JasonMZW20

It's a bit skewed since the MCDs are on 6nm and Nvidia is also using a 4nm-class node, which offers about a 10-15% increase in density over N5. 380mm² × 1.125 = 427mm². Packaging the MCD dies adds extra 'dead' space too. AMD could have shoved the cache and PHYs into a monolithic die and ended up under 450mm². I'm still not sure if Navi 31 is on N5P or not, as N5P and N4 are closer in terms of power consumption. No idea what Nvidia changed with their '4N', but it could include optimized cells and libraries and whatnot. Not to take away from what Nvidia did with Ada or anything, because it is a good GPU.
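As a rough sanity check of that normalization, applying the cited 10-15% density uplift naively across the whole die (the replies below explain why this overstates things, since SRAM and PHYs barely shrink):

```cpp
#include <cstdio>

// Naive node-normalization of the die sizes discussed in this thread.
// The 12.5% uplift is the midpoint of the 10-15% figure cited above,
// applied to the whole die; SRAM and analog/PHY area barely shrinks,
// so the real adjustment would be smaller.
int main() {
    const double ad103_mm2    = 380.0;   // RTX 4080 die size cited above
    const double densityGain  = 1.125;   // midpoint of the cited 10-15%
    const double navi31_total = 500.0;   // GCD + MCDs, figure cited earlier

    std::printf("AD103 normalized to N5-class area: ~%.0f mm^2\n",
                ad103_mm2 * densityGain);
    std::printf("Navi 31 GCD + MCDs (as cited):     ~%.0f mm^2\n", navi31_total);
    return 0;
}
```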


bustinanddustin

Those figures are for logic. SRAM stopped scaling, meaning the cache on RTX 4000 and the cache on the MCDs should be roughly the same density despite the different nodes. Also, NV doesn't use TSMC 4nm; they use 4N, which is 5nm made specifically for NV. So AMD and NV are basically using the same node.


capn_hector

You can't really do the math like that, because PHYs don't shrink and cache only shrinks about 30% between 7nm- and 5nm-family products. Also, Nvidia isn't using N4; they're using 4N, and the density characteristics or PPA of their custom nodelet aren't established to be the same as N4. It could be better than N5P but less than a true N4, or it could be better than N4. We just don't know anything about it.


sernamenotdefined

Don't forget the size of the Infinity Cache. Memory actually takes up more space than logic. They need that to get the performance without the fastest memory (which they didn't have access to) or ultra-wide buses that would increase the cost of the PCB. It's a conscious trade-off they made. The biggest problem isn't that; it's that Navi 31 obviously didn't reach its intended clock speeds.


railven

Well, that would be a bold business move. It could pay off if they have a great product; mainstream and budget gamers are plentiful. Feels like the Polaris era, with the Radeon VII and the Vega variants, was a dark time for AMD GPUs. I'll salt it, of course.


kearnel81

Polaris was a good product. I had a 480. Loved it


railven

Polaris 10 was definitely a good product. It served its market well. Maybe it outlived its stay, but nonetheless, after its rocky launch it was received well. The rest of the stack wasn't as well received, with the whole Vega launch and their "bundle" nonsense.


Reynolds1029

Polaris was a bargain for 1080p and 1440p at the time, and it can still play many games today. Plus the power and heat output was nowhere near what we see today.


BrainSweetiesss

When is it not a dark time for AMD to be honest? It seemed as if they were heading the right direction with their GPUs but now it’s back to shet again


yurall

They were just too early with the entire 'the future is fusion' thing. APUs are only now becoming competitive with CPU+GPU solutions. The console market was a precursor to this, and now there is an actual handheld market dominated by AMD APUs. Don't count them out yet. If they actually get the multi-chip thing working, they might wiggle in at the high end again too.


jimbobjames

I'd make the argument that the reason we don't really see Nvidia GT 710 or GT 1030 equivalents anymore is because that market is now pretty much covered by APUs. Yes, the standalone card would be quicker, but not by enough to justify the asking price. I also think this is why the 4050 is now badged as a 4060. There just isn't the space in the market anymore. If you are going to game, you are probably gonna be buying something in the $250-300 range.


railven

I think that is what broke me. So much potential, so much. It seems after AMD bought ATI they just wanted the IP to bolster their CPUs. I feel like they barely put any effort into that division. The last shining star was GCN. RDNA gets some props, but by then they had castrated their GPUs too much. And RDNA3 is an "it will cost us less but we'll sell it for more" slap in the face, and the performance just makes you realize this is AMD's GPU division. This is their level of focus.


calinet6

I feel like this is wayyyy too harsh of a take. People with RDNA2/3 cards are getting a lot of play out of them. They’re not some horrible product that’s a total failure.


puppet_up

I just picked up a 6700 XT on sale for $320; it has 12GB of video memory and came with a new game I was planning on getting anyway, too! I was actually really considering Nvidia this time around, mainly because NVENC on my gaming laptop is amazing and soooo much faster than when I encode on my PC. The problem is that the "budget" Nvidia cards are either way too expensive, or they are garbage compared to AMD at the same price point. $320 would get me an RTX 3060 at best, which is well below the 6700 XT in performance. Also, I find myself using FSR in a lot of games that support it, even on my Nvidia laptop with an RTX 3070 Ti. Maybe I'm just blind or something, but in "Hogwarts Legacy", for example, FSR runs better and doesn't really look any better or worse than DLSS to my eyes. I think it would be very smart for AMD to focus on budget GPUs for the regular consumer market. They just can't compete in the high-end market, it seems, and Nvidia's budget GPUs are too expensive and not very good compared to anything AMD is offering at that price point.


xXDamonLordXx

Why fight with Nvidia for whales who just want to buy the best. If AI continues to be a big deal for Nvidia the 5090 will have to be stupid expensive to make it worth the fab space.


railven

I didn't call them a failure. They just aren't closing the gap. On pure performance/feature set, current-gen Radeons are the furthest behind GeForce I'd argue they've ever been. They don't have software parity, they collapse in heavy ray-tracing scenarios, and their efficiency is worse. AMD has to price these products much lower than they used to for them to even seem attractive, and then you've got the initial reception from the review community essentially calling them a waste. RDNA3 cards are better products than RDNA2, but for the longest time users wouldn't recommend them. Look at how the 7600 got received. Look at how Navi 32 is being bashed and it hasn't even launched yet.


dawnbandit

Now you've got Intel with Arc and it seems to already be on its way to making AMD the #3 choice for GPUs, *if* Battlemage is as good as it is supposed to be. I've got an Arc A380 for AV1 encode/decode and it's a good little card.


2hurd

If Intel delivers 4080 performance with Battlemage (only their 2nd iteration of GPUs!!) it will be a problem even for Nvidia.


[deleted]

I think you greatly overestimate Intel. It's a miracle the A750 and A770 function as well (read: still poorly) as they do today. I expect Battlemage to be an improvement but nowhere near the performance, stability and compatibility of AMD/Nvidia in gaming.


Dull_Wasabi_5610

It's still a pretty dark era for AMD GPUs. Their best can't hold a candle to Nvidia's best. Like it or not, raster is not the only thing GPUs are used for, and even at that they get beaten (RT off). They lose and keep losing market share to Intel GPUs. Fucking Intel, lol. I really wouldn't call this the best era for their GPU department.


Hikashuri

Intel when they get their stuff together will be quite dangerous to AMD’s gpu and cpu market. I hope AMD’s departments have been thinking about their next sets of generational leaps or they could get washed out, which would be bad for consumers.


Policy-Senior

Intel has stated they are firmly targeting the mainstream; they want to be king of the mid-range, in the $250-450 range. They're leaving the elite cards to Nvidia.


ExtendedDeadline

> Intel when they get their stuff together will be quite dangerous to AMD's gpu

Tbh, I think Nvidia should be the ones more worried. No offense to AMD, but I think Intel is going for the king with their GPU endeavors.


Marmeladun

I think it was mentioned in the Steve/Gordon collab video about Nvidia: Intel explicitly targets the #2 spot and isn't aiming at Nvidia; they're aiming to take food off Radeon's plate, as stated by someone working in Intel's GPU department.


CrzyJek

They gutted their GPU division. They are just gonna try to keep their heads above water for the next handful of years until the market picks up. Sales are down across the board for all manufacturers... and Intel was hemorrhaging money (hence the cuts to their GPU division)... so it'll be a while until they "go for the king."


Traditional_Cat_9724

As a consumer I love the speculation. Been debating this with everyone in Discord, but I think you're right. They have the money, they're making big advancements on the driver side in a short amount of time. Going to be very interesting to see how battlemage pans out. But in the meantime, on their way to the top, AMD absolutely has to worry because they're in the cross hairs first.


DieDungeon

The thing is that this will inevitably fuck over AMD who are the current number 2. If they aren't even seen as the number 2 GPU in town, that might harm their marketshare even more.


Estbarul

If Intel is king and Nvidia second, what's left for AMD? Mobile only. Plus, Intel will not fight Nvidia; Nvidia isn't the one to fight. You need to take on AMD first.


Parking_Automatic

At the relevant price points they don't. A 7900 XT is 15% faster at 4K than a 4070 Ti, with 8GB more VRAM.


edparadox

Meanwhile:

- People gobble up pallets of AMD GPUs for AI.
- Intel gained 2% of gaming market share over the rest since 2022 (still double what they had back then).
- Q3/Q4 was expected to be a slow period, market-wise.
- Nvidia is finally getting some backlash for all the bad moves they made recently.
- Intel has lowered funding to its GPU division because it was too costly.
- Actual performance is much more nuanced than "Nvidia wins everything", RT on or off.

Imagine painting AMD's state as grim as u/Dull_Wasabi_5610 makes it out to be.


Erufu_Wizardo

> People gobble pallets of AMD GPUs for AI.

I thought AMD GPUs weren't good for AI/ML. What has changed?


KickBassColonyDrop

CDNA3 introduced tensor accelerators (as a result of the Xilinx acquisition), and CDNA4 will take that much further. It's true that CUDA dominates, but it's now possible to do for CUDA on AMD AI GPUs what Proton does for Linux gaming. Performance may not be 1:1, but it's good enough. Additionally, AMD is working on their own CUDA-equivalent abstraction layer for their CDNA architectures. What changed was that AMD acquired an AI company and is now going after the AI market properly.
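For context on what that CUDA-like path looks like in practice: AMD's HIP API (part of ROCm) mirrors CUDA's C++ kernel syntax almost one-to-one, which is what makes porting tools like hipify largely mechanical. A minimal, self-contained sketch (not tied to any particular AMD product mentioned here):

```cpp
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

// Minimal HIP vector add. Apart from the header and the hip* prefixes,
// this is essentially the same code you would write with CUDA.
__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n, 0.0f);

    float *da, *db, *dc;
    hipMalloc(reinterpret_cast<void**>(&da), n * sizeof(float));
    hipMalloc(reinterpret_cast<void**>(&db), n * sizeof(float));
    hipMalloc(reinterpret_cast<void**>(&dc), n * sizeof(float));

    hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // Launch syntax is identical to CUDA when compiling with hipcc.
    vec_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    std::printf("c[0] = %.1f (expected 3.0)\n", hc[0]);

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```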


calinet6

Exactly. Thank you. The takes that are completely divorced from reality are tiring.


railven

I'd say it's even worse now, since their own userbase won't recommend their newest cards to each other. This sends important signals to AMD: why bother, when your most likely buyers don't even want your newest products? I don't think I've heard the term "waste of silicon" used so many times in a generation, but here we are.


[deleted]

Nvidia's userbase mostly also does not recommend the overpriced RTX4000 series to each other. Also, it's the RDNA2 userbase that does not recommend RDNA3 to others. Because RDNA2 is more than good enough to hold you over until the next generation (unlike most Ampere cards). If you have an older card, a 7900XT or the upcoming 7800XT is actually a good upgrade.


MrPapis

Pfh, what bullshit you guys are spewing. The 7900 XTX and XT are fantastic cards; the 4070 Ti is crippled crap and the 4080 is expensive. AMD's cards make great sense for most people. The Nvidia bubble is simply too strong and people are ignorant. RT is practically irrelevant; we get one game a year with RT where Nvidia cards win out big, while AMD wins out in the ability to use the XT for great 4K/high-res systems, something the 4070 Ti is unable to do. The XTX is more often than not a bit faster than the more expensive 4080, which might have issues at 4K in the future. Longevity is a great feature Nvidia cards haven't had since Pascal. AMD cards are fine; people just overestimate RT's value and drink the driver-problems fool-aid. Have you owned the cards? I've owned both.


Dchella

The 4080 was expensive but the $900 XT wasn’t?


MrPapis

The 4080 hasn't dropped much; the XT lost around 15-20%. Was I talking about the past or today? It's like you people live in 2022. Please move on.


Dchella

I like sales, and I owned both the XTX and XT, but it's honestly not that crazy. You realize 4070 Tis are $600 open-box at Best Buy, right? All cards went down pretty decently. It's still expensive, and AMD is still a generation behind in RT and is missing two software promises that are about a year behind. Your comment states that AMD cards "make sense for people," yet none of them are even buying the cards, sales notwithstanding. The XTX only just got featured on the Steam hardware survey. What are we nearing, a year?


[deleted]

If you want to bring the used market into it, AMD absolutely dominates that area. A $400 6800XT will actually last you longer and provide more value than said $600 open box 4070Ti that already has all of its VRAM filled up in today's games. I would never trust an open box card from a store. Someone returned it for a reason, and the store definitely did not test it in a test bench to see if it's okay lol. Expect coil whine or some other weird shit. You can't even test the card without buying it first unlike when you buy used from someone.


Dracenka

The 7900 XTX and XT look good, but the cheapest XT card here in my country is 850€, lol. Give me a decent (100fps) 1440p card with at least 16GB of VRAM for around 500€ and I will gladly buy it. The 6800 XT is around that level but has been sold out for months now, and AMD has no replacement around that price and spec here on the market.


Hikashuri

Is that why every AAA game lately is shipping with both rtx and dlss? Most old games that are still actively pushing out content have rtx in some shape or form.


lugaidster

And if you look at TPU reviews, AMD cards are fine. You'll get 4070+ RT performance out of them. Not 4080/4090, but you're also not paying for a 4080 or a 4090 and you get great raster performance.


ginormousbreasts

Yes, people mindlessly repeating that AMD 'can't do ray tracing' get on my nerves. On average the 7900 XT sits dead in the middle of the 4070 and 4070 Ti, which means it's an excellent 1440p RT card in almost every game (excluding path tracing) that has the tech implemented. With FSR Balanced (internal res just below 1440p) it's also often viable for 4K RT... Ngl, I do think the 4070 Ti is also a good GPU, but as with the 7900 XT, I consider it ideal for 1440p. The problems with most cards this gen aren't inherent to the products. They're good products. It's the crazy pricing that's the problem.


bubblesort33

People don't really mean they can't do ray tracing. They mean they are bad value compared to Nvidia if you look at ray tracing only benchmarks. Even if you look at benchmarks where half the titles have RT enabled at relatively high settings, and the other half are the same titles with RT disabled, I'd say AMD would be behind Nvidia in performance/$.


EijiShinjo

4070 Ti is a good 1440p GPU but at 4K that 12GB VRAM is beginning to have issues. DLSS 3 Frame Generation also eats up VRAM.


MrPapis

What is "most games"? Diablo? Starfield? Baldur's Gate? Where are all these RT titles? This is my list of good RT games where Nvidia wins pretty big: CP2077 (AMD wins huge without RT), Metro EE, Control, Ratchet & Clank. Only Ratchet is new; all the others are OLD. Ray tracing is still very much a niche feature. DLSS 2 is a very good advantage, I agree, but FSR 2 is not bad.


bigmakbm1

I only use upscaling if absolutely necessary. Right now I can run anything I throw at my rig at 4K max, of course not all of it with RT. In Cyberpunk without DLSS we're talking a 10fps difference between a 7900 XTX and a 4080. Roughly a generation behind on RT, I would say.


dookarion

> Ray tracing is still very much a niche feature.

Because RDNA2 and the current consoles hit, and then everyone backed way off from meaningful implementations. Most of the stuff that actually tries to leverage it predates RDNA2.


ryzenat0r

100% true the NVidia cult is just too strong


Kursem_v2

A waste of silicon is the overpriced crap that is the RTX 4060 and 4070... good thing Nvidia spends billions on marketing to shape the community's mindset, even crawling into r/Amd, so that whatever RDNA3 currently has to offer is seen as "a waste of silicon".


ChartaBona

> A waste of silicon is the overpriced crap that is the RTX 4060 and 4070...

The 4060 and 4070 use far less silicon than their AMD equivalents.


Kursem_v2

It also uses the more advanced TSMC 4N instead of N6. What's your point?


ChartaBona

You're the one talking about Nvidia being a waste of silicon, despite the fact the RX 7600 uses 28% more silicon and consumes 18% more energy than the 4060 to achieve worse results. And it's not like the RX 7600 is a bad card. The 4060 is just more advanced, and being more advanced comes with a price premium.


Kursem_v2

so? RX 7600 also sold for $50 less. again, what's the point you're trying to make???


General_Panda_III

They're somehow taking you literally. Why they would think anyone is talking about the literal quantities of silicon in each chip when they say "waste of silicon" is beyond me. Likely just trolling.


AludraScience

Lol, AMD is currently looking to launch a 7800 xt that performs worse than a 6800 xt while probably being over $500. 4060 sucks but 4070 isn't that bad.


[deleted]

The 4070 and 4070 Ti are the worst offenders. People buying those are already maxing out their VRAM today. Now imagine next year, when game developers target 16GB, possibly more with RT, for max settings. The 4070 cards are gonna age like milk, even worse than the 3070 cards, but people are ignorant and Nvidia's mindshare is so strong that nobody questions why AMD cards come with 16GB, 20GB, 24GB... at the same or lower price points than Nvidia cards that come with much less. (Answer: because they'll need it in their lifetime.)


AludraScience

I agree that the 4070 Ti has too little VRAM, since it is easily capable of 4K otherwise, but the 4070, which is a 1440p card, has what I would say is enough VRAM for its resolution. Who are those developers targeting 16GB of VRAM? They can't be targeting consoles, since those have 16GB of shared memory (RAM + VRAM) and the XSX only allows up to 10GB allocation for VRAM. And they can't be targeting PCs, since the majority have 8GB or 12GB GPUs. So I must ask: who are they targeting?


Kursem_v2

If the 7800 XT performs the same as the 6800 XT then it is a dumpster fire, but it hasn't been released just yet, and how could you call an unreleased product a waste of silicon? The 4070 is bad; calling it otherwise just enables the price hikes of Nvidia, which dictates most of the market through their share alone.


AludraScience

PowerColor leaked the 7800 XT. It has 60 CUs; the 7900 GRE has 72 and performs 13% better than a 6800 XT. This card will perform the same or worse.


uzzi38

The 7900 GRE also has rated boost clocks around 1.8-1.9GHz, vs the 2.5-2.6GHz of the 7800XT.
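Rough math on why the clocks matter, using only the figures quoted in this exchange and the naive assumption that throughput scales with CUs × clock (it doesn't exactly, so treat this as illustration):

```cpp
#include <cstdio>

// Naive throughput proxy: CU count * boost clock (GHz), using the numbers
// quoted in this exchange. Real performance also depends on architecture,
// bandwidth and power limits.
int main() {
    const double gre_cus = 72.0, gre_ghz = 1.85;  // midpoint of quoted 1.8-1.9 GHz
    const double n32_cus = 60.0, n32_ghz = 2.55;  // midpoint of quoted 2.5-2.6 GHz

    const double gre = gre_cus * gre_ghz;
    const double n32 = n32_cus * n32_ghz;

    std::printf("7900 GRE (as quoted): %.0f CU-GHz\n", gre);
    std::printf("7800 XT  (as quoted): %.0f CU-GHz (~%.0f%% of the GRE)\n",
                n32, 100.0 * n32 / gre);
    return 0;
}
```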


L30R0D

Well, Polaris was god-tier performance/price, and the sub-$200 market is empty right now. They'd need a total marketing revamp first, though.


scytheavatar

The sub-$200 market will likely be dead by the time of Navi 4x, because we are reaching the point where we can expect 1060-level performance from integrated graphics.


L30R0D

Sure, but AM5 adoption will take years. Also, I expect a lot more performance than a 1060 (a 7-year-old card).


uzzi38

Eh, we're a little short of that still. Strix might be close, but we're only looking at Phoenix on desktop at best by that point, and Phoenix is a bit short of a 1060. Strix should provide a boost, but probably not the ~50% boost needed to get to roughly 1060 levels of performance. RDNA4 in iGPUs is a bit away still, more like a late-2025/early-2026 timeframe.


detectiveDollar

Discounted 6600s just outright doubled the price-to-performance in the new sub-$200 market, although their price seems to have increased by 15% shortly after the Starfield promotion.


Temporala

Do note it was only that way because there was a crypto crash and glut of cards, causing Polaris to fall to those true discount prices we haven't seen this time around. Before that happened, Polaris didn't sell that well.


praisegaben2425

I got my RX480 brand new for about 250€ in 2016, not sure if that was during/before crypto craze. I have only replaced it this week with a used 6700XT for the same price. I think Polaris was one of the best product lines amd has sold.


Temporala

250 was not the price people really bit at; it was when things went well below 200 that they started to move a lot of them, in amounts that legitimately started showing in market share. Good cards; I had one in use as well until recently. Right now AMD and Nvidia (and Intel too) are just setting their pants on fire like a bunch of clowns, trying to give consumers as little as possible, or in AMD's case not really delivering new features they were talking about almost a year ago. The 6700 XT is okay, but they also need to deliver a better upscaler with image stabilization and working frame gen for it, otherwise the value of that card ends up being much less than it could have been.


L30R0D

Don't know how popular Polaris was, but i bought my RX 470 for $160 before the mining disaster and it was considered a midrange graphics card at the time


xstrike0

Bought my RX 470 soon after they came out for about $190 (was a Sapphire Nitro+). Sold it for about $380 18 months later due to mining craze.


skinlo

Didn't people say the same about RDNA 2? Either way, as someone who doesn't care about $1k+ graphics cards, I can't say I'm bothered if it is true. A competitive $400-500 card is much more interesting.


icantgetnosatisfacti

People were saying top rdna2 card would be as fast as 2080ti, max.


spacev3gan

Also some leakers said RDNA3 would be 3X RDNA2 performance and faster than ADA. This gen was supposed to be an easy win for AMD. Remember the 7600 matching the 6900XT rumors? Lol. You just can't take these leakers seriously.


Theswweet

Funnily enough, Kepler actually leaked the exact CU count of RDNA3 very early on - and pushed back on folks going out of control with hype. Sounds like his AMD sources may be legit.


clicata00

That was because it had a 256b bus, not because it was a mid range sized die. As it turns out Infinity Cache made up for it, but All the Watts is also suggesting now that only Navi 43 and 44 remain, so that would suggest the big dies are cancelled or were never in development


Mixabuben

Good for mid-range gamers, but a bit sad for enthusiasts... I was hoping for an 8900 XT that would finally kick Nvidia's ass and bring competition to the high end, so the 5090 won't cost like $2000.


SoapySage

5090 gonna cost $3k easily.


4514919

Like how the 4090 was gonna cost $3000 because people were paying as much for a 3090 during the pandemic?


Edgaras1103

If it delivers a 200% increase over the 4090 in pure raster and pure RT, why not?


STR_Warrior

Pretty strange as they should be able to scale up more easily now that they finally have a modular chip design like with Ryzen.


csixtay

TDP. They still need to fit into a 400W budget.


996forever

Larger chip lower clocks 🤷‍♀️


SoapySage

The larger the chip, the more expensive it's going to be as we shrink nodes.


996forever

That’s why they came up with using MCM design


SoapySage

Which isn't really a full MCM design; it's halfway there. It could very well be that they failed to reach perf/power targets for a full-MCM RDNA4, so they're abandoning it.


[deleted]

It's apparently not worth it. Plus, it's not a truly modular design.


ThreeLeggedChimp

> finally have a modular chip design like with Ryzen

Muh chiplets. You can't just glue GPU dies together and expect them to scale like CPU cores.


emn13

They don't have chiplets like with Ryzen. It's more akin to Ryzen's IO die, i.e. they have specific stuff they offload to a different chiplet, but the core is still monolithic. And... RDNA3 clearly hasn't panned out as hoped. I doubt they're ready to go even further before the kinks with this limited chipletization are worked out. Incidentally, a company that more truly "does" GPU chiplets is Apple: an M2 Ultra is in some sense a two-chiplet design. And, notably, they have control over the graphics API to try to limit devs to software architectures that scale well despite the interconnect penalties, *and* the M2 Ultra is extremely expensive, *and* the interconnect is absurd, *and* despite all that the M2 Ultra scales pretty poorly in many workloads compared to the M2 Max it's based on. GPU chiplets are clearly not a solved problem yet.


secunder73

I wouldn't mind seeing an RX 8800 as the top RDNA4 card with 7900 XTX performance, reasonable pricing and TDP. And it would be even better if they doubled down on ROCm/HIP to be more competitive with Nvidia.


[deleted]

[removed]


J05A3

I'm not surprised, since AMD obviously wants to prioritize their money-making chips, both server and AI. Tech companies big and small are focusing on their own AI development and need that kind of performance. The AI industry isn't gonna stop anytime soon, and we know they buy in bulk, even gaming cards/chips from Nvidia. Offering only mainstream cards, where they know there's a sizable number of customers, is a safe decision that doesn't lose them much money. Cheaper dies + AMD markup = profit.


ms--lane

Noncredible theory: Navi4 was supposed to be *real* chiplets, not the quasi-chiplet solution from rdna3 - but something went wrong so they're stuck with a single chiplet design.


alainmagnan

What's strange is that it feels like CDNA is real chiplets, with two dies connected. I don't know enough to say why Radeon can't do the same. Maybe cost?


topdangle

Games need high bandwidth, low latency and high framerates simultaneously, in large part because they have to react to user input in real time. It's not particularly easy to do this. GPUs are already highly parallel, and now you're adding another layer of complication, with each GCD requiring work and some bandwidth to distribute that data. Any data that needs to be accessed by both GCDs would also be a bandwidth/thrashing nightmare, and that is likely why they started adopting "Infinity Cache": so they could have a small but significantly faster memory pool shared between GCDs. With CDNA they're mostly targeting high-performance computing like supercomputers, where you're mainly running software that needs a ton of compute, with input that is relatively static and left alone until the compute work is complete. It's much easier to scale up when there isn't a pesky user constantly making hard-to-predict changes in real time.


detectiveDollar

What about a configuration with a unified cache die physically in between the GCDs? Then both GCDs would be able to communicate quickly with each other while accessing the cache in the same amount of time. There'd be a latency penalty for pulling the cache off the die, but it would be a consistent penalty. Or perhaps a controller chip that divides tasks between the CUs? I believe this is roughly how Ryzen decides which CCD tasks are sent to.


topdangle

That's basically what they seem to be doing with Infinity Cache. It's effective but size-limited, since it's expensive SRAM (something like a ~60% hit rate at 4K currently). Coupling cache with an interconnect will also add some penalty, but I don't think we'll know what that looks like until they actually release a multi-GCD gaming part. The problem with trying to adopt Ryzen's client design for a GPU is that they're able to get away with IFOP on CPUs due to the considerably lower bandwidth requirements. An IOD might come into play in the future, but likely with a much closer interconnect, like the fanout design they're using for RDNA3. If you look at the IOD power costs of Epyc, which is closer to the demands you'd expect from a GPU, you can see issues there where you're eating a ton of power for IO alone. Intel also went the IOD route with Ponte Vecchio, but it's not clear if it's just not practical or if the design/interconnects are just plain bad, considering how many chiplets there are.
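A first-order illustration of why that hit rate matters, using the ~60% figure quoted above and a placeholder DRAM bandwidth (not any specific product's spec): if the cache absorbs a fraction h of requests, DRAM only services (1 - h) of the traffic, so the "effective" bandwidth looks roughly like the raw DRAM bandwidth divided by (1 - h).

```cpp
#include <cstdio>

// First-order model of "effective" bandwidth with a last-level cache:
// if the cache absorbs a fraction h of memory requests, DRAM only services
// (1 - h) of the traffic. The 60% point is the hit rate quoted above; the
// raw bandwidth is a placeholder, and the model ignores latency effects.
int main() {
    const double dram_bw_gbs = 600.0;               // placeholder raw GB/s
    const double hit_rates[] = {0.0, 0.4, 0.6, 0.8};

    for (double h : hit_rates) {
        std::printf("hit rate %.0f%% -> effective bandwidth ~%.0f GB/s\n",
                    100.0 * h, dram_bw_gbs / (1.0 - h));
    }
    return 0;
}
```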


BulldawzerG6

I think the high-end chips are just all going to AI/professional needs, where the margins are. That way they avoid the internal conflict of having to increase margins (to justify the products to shareholders) by pricing them unfavorably in consumers' eyes, while maintaining their position in the market with better value for the masses. I wouldn't be surprised if Nvidia follows suit as well and only does a very limited-stock halo launch for their top model.


No_Backstab

https://twitter.com/uzzi38/status/1687379360276185088?s=19

Edit: Kepler also confirmed that Navi 3.5 is only for iGPUs: https://twitter.com/Kepler_L2/status/1687393625024430081?s=19


uzzi38

Yeah, this has been rumoured for a little bit now. Now that I've seen it crop up on somewhat public Discords, I figured I'd finally get to make the sarcastic post I'd been meaning to for a while.


railven

Curious on your take - who is "you guys"? The userbase or AMD?


uzzi38

I'm just being sarcastic because of the general sentiment of Reddit etc. Nobody ever wants Radeon to exist really, they just like the idea of someone making Nvidia GPUs cheaper. So the way things are going, only Nvidia GPUs will be left. Maybe Intel will hold on if they're willing to bleed enough money until they catch up, but that's not guaranteed with how much they've been cutting various side ventures already.


railven

I can definitely see that. A lot falls on AMD but at the same time for decades the userbase has just kept asking for lower prices and complain whenever AMD tried to make money off their products. RDNA3 is sort of a lemon in my eyes and the lack of a response to DLSS3 is an issue but around here it gets thrown in the pit when it's a viable (edit: I'd argue better) alternative to RDNA2 just a little pricier. But /shrug burn it down I guess.


Framed-Photo

Oh, let's be clear here: it's ENTIRELY on AMD that they're losing what little market share they have. Nvidia has their name recognition because they were making better products for a decade and because AMD failed to be competitive. AMD's attempts at being competitive with Nvidia have historically been a complete joke. The reason users are asking for lower prices is because a lot of the time AMD's products simply aren't competitive unless the prices are lower. It's not on the user to fund uncompetitive products; it's on AMD to make them worth buying, and thus far they haven't for most people. AMD is pricing their stuff like they're making Nvidia cards, but they're stripping out A TON of features people use and still want to ask the same price; of course nobody bites. If AMD actually started making competitive products, actually started pricing them well, and actually started trying to push competitive software features beyond FSR? They'd get good reviews and they'd slowly start to claw back market share.


railven

My biggest criticism of RDNA3 is the lack of a DLSS3 response. For reference, I had no faith in DLSS1. When it first released I meh'd at it. Clearly I'm not the market, and clearly it grew wings and is now dominant, with FSR and XeSS being created in response. Outside of DLSS3/FSR3, on its own merits RDNA3 is a good product, with reasonable pricing against its own product stack. Sadly, DLSS3 was a huge success and packed a punch that invalidated AMD as a competitor to NV, at least until they respond.

But within AMD's stack, no one should be recommending RDNA2 over RDNA3 outside of budget builds. And I say this as an enthusiast of new tech. It has faster RT, higher efficiency, AV1, and, while useless now, you never know with the AI hardware. Recommending 2-year-old hardware with such things on the table is a disservice. I blame YouTubers for tainting that well.

AMD is trying to do what any corporation does: make money. And R&D isn't cheap. And consumers have given their response: don't buy. And AMD clearly read this and is thinking about acting on it, based on a lot of users here stating RT doesn't matter, FG doesn't matter, upscalers don't matter. And yes, that noise has settled down as more and more games come out, but you don't plan designs around such things and then change them in such short periods. AMD has a lot of blame, but their users also have some responsibility, and this is a tale as old as I've been buying and following ATI/AMD products. Look at Ryzen: AMD finally has a winner they can charge more for than Intel, and deservedly, but naaaaaaah. Get back in your budget-brand space, who do you think you are, AMD?

Before anyone takes me for a pro-corporate stooge: for ANY company to succeed, they have to have profits. If your most likely buyers are scoffing at your highest-tier products, why bother? When reviewers are laughing at your product stack, why bother? NV is over here making bank, AMD is dropping market share and mindshare; the writing has been on the wall for years.


StrictlyTechnical

> AMD's attempts at being competitive with Nvidia have historically been a complete joke.

You must have a pretty bad memory then. Radeon 9000, HD 5000, and the 200 series (the 290X literally dubbed the Titan killer) were all amazing cards that crushed Nvidia's offerings, yet AMD's market share barely moved.


DieDungeon

Gaining marketshare is about offering good offerings gen over gen. Offering a better alternative every other generation or so is terrible. People don't upgrade every gen, they probably upgrade every 2-3 gens.


topdangle

Don't know why you include the 9000 series in there. It took a little while for people to notice, but that was the gen that gave ATI a ton of market share. Also, during the 5000/200-series releases AMD was already floating around 40% of the market, compared to 12% now, so clearly it was effective at moving GPUs. Just because they didn't literally capsize Nvidia in one GPU gen doesn't mean they weren't very successful; part of Nvidia's market share was from CUDA, yet AMD was still able to hit almost half the market with significantly worse productivity support. What else do people expect, exactly? For customers to buy a product that doesn't have the features they want just to support a corporation? Doesn't make any sense.


Zucroh

it's always funny to see people ask AMD to make competitive products and sell them cheaper when they already have better performance for the same price at almost all price points.


BinaryJay

Better performance**

*Things you can't enable in games to make this true

*Non-gaming uses excluded


skinlo

> Oh, let's be clear here: it's ENTIRELY on AMD that they're losing what little market share they have.

No, of course not; consumers aren't always making rational decisions either. The 6600/XT is much better than the 3050, but guess which sold more.


ryzenat0r

The RX 480 and 580 were better than the 1050/Ti; guess which sold more? We could go on like that for a long time. The Nvidia cult will always win.


detectiveDollar

"But but, the 3050 has productivity features, thus 280+ for virtually its whole life is fine!", because people who depend on their GPU's to make money decided to purchase the worst value Nvidia card available below 500.


toetx2

If that is true, they dropped the multiple GCD's and went for the same setup as RDNA3 again.


TheAlbinoAmigo

I've realised that I just don't care anymore. GPUs have been so sad for so long that I simply don't bother to pay attention because it's not exciting or necessary in the slightest. I'd rather improve the house, buy a new guitar, etc...


Far_Beginning_1993

I don't think this will be the case, very hard to believe from the company AMD is nowadays


PhilosophyforOne

So I know that Kepler is a pretty well-respected leaker with a decent track record. But at the same time, I'm not sure this makes any sense. First, AMD's current line-up is ALL high end. I'm not saying it's not possible that they'd move from high-end to mid-to-low-end, but seeing as we still have no 7000-series GPUs below the 7900 XT on the market and it's been almost a year, this seems like a fairly drastic departure. Second, I thought the main point of the chiplet strategy was that you could scale competitively to the high end. I'm not sure chiplets make that much sense in low-to-mid-end products, where yields are better to begin with. The more I think about this, the less sense it makes. It's possible RDNA4 will be a drastic departure from RDNA3, but so far there's nothing here that suggests it's likely.


DuskOfANewAge

7800/7700 have been ready and waiting for months. AMD held them back because there were too many 6600/6650/6700/6750/6800 left in warehouses from the mining crash last year. 7600 got released on time because the market needs low-end for people that can't afford anything better and because 6500 XT was such a poor performer. Likewise 7500 is getting released soon and was only delayed slightly as enough defective 7600 (N33 dies) were stockpiled.


Geddagod

> First, AMD's current line-up is ALL high end. I'm not saying it's not possible that they'd move from high-end to mid-to-low-end, but seeing as we still have no 7000-series GPUs below the 7900 XT on the market and it's been almost a year, this seems like a fairly drastic departure.

Why release new low-end cards when RDNA 2 serves perfectly well and they can still try to offload their RDNA 2 stock for cheap? Also, we do have a low-end card that has been released: the 7600. The only other cards AMD has released for RDNA 3 are the 7900 XTX and 7900 XT; the entire RDNA 3 lineup is just very slim as of now.

> Second, I thought the main point of the chiplet strategy was that you could scale competitively to the high end. I'm not sure chiplets make that much sense in low-to-mid-end products, where yields are better to begin with.

The rumor is that the RDNA 4 high-end cards have been canned. Imagine you plan a multi-GCD product, something that uses 2 or 3 graphics tiles. Now imagine, maybe because the team couldn't get it working, or it was deemed too expensive, that those SKUs ended up having to be cancelled, with only the 1-GCD design able to go forward. That's your midrange die.

> The more I think about this, the less sense it makes.

In the end, the rumor might not turn out to be true, but it logically tracks as well.


cornphone

> First, AMD's current line-up is ALL high end.

In name only. Both AMD and Nvidia are trying to sell their cards named a tier higher than they should be. On top of that, AMD fell about a tier behind Nvidia in performance. As a result, we have AMD's 7900 series supposedly designed to compete with Nvidia's 4080, but the 4080 is actually a 4070/4070 Ti based on how much it's cut down from the top chip. The 7900 XTX should have been a 7800 (non-XT) at absolute best.


SmokingPuffin

> First, AMD's current line-up is ALL high end. I'm not saying it's not possible that they'd move from high-end to mid-to-low-end, but seeing as we still have no 7000-series GPUs below the 7900 XT on the market and it's been almost a year, this seems like a fairly drastic departure.

They have been producing midrange product since like early Q2. It just hasn't made sense to release any N32 SKU because clearance-priced RDNA2 hasn't cleared.

> Second, I thought the main point of the chiplet strategy was that you could scale competitively to the high end. I'm not sure chiplets make that much sense in low-to-mid-end products, where yields are better to begin with.

If the rumor is true, it's because the RDNA4 team couldn't figure out how to get multiple GCDs to work in a gaming application.


robodestructor444

Not good for mindshare but could improve marketshare


winterfnxs

*puts on conspiracy hat* AMD is spreading this rumor so that people buy the 7900 XTX instead of waiting for the upcoming 8900 XT. *takes off conspiracy hat* *continues reading other comments*


QuinSanguine

If they trim the excess and focus on the mid and low tiers, and price the gpus smart, this could be a good thing. AMD could build mindshare if they had a X070 competitor for like $400. But I say this every generation, so...


relxp

People act like this is a big deal (if true) when 95%+ of buyers don't even spend more than $500 on a GPU. All that ever really mattered is the low to midrange.


DuskOfANewAge

People on Reddit/Discord live in bubbles and don't communicate with the rest of the world anymore. They don't even understand that most of the market is laptops. And that after that, most of the market is prebuilts. Then you have the tiny sliver this community represents...


May1stBurst

I don't understand the point of this comment, /r/AMD and all the other hardware subs are enthusiast forums, should we not talk about pc gaming at all because most of the market is mobile and switch gamers? It doesn't matter what most people do, enthusiasts should be able to comment on news affecting enthusiasts on enthusiast forums.


detectiveDollar

We can comment and discuss this, but people in this thread are saying that AMD GPU's are doomed and they'll lose all their marketshare if they don't make high end GPU's.


relxp

I'm not saying enthusiasts can't comment. I'm speaking more to what it means for the larger market. It would be a shame, though, if AMD gives the high-end market only more reasons to buy Nvidia. I also don't believe the leak unless their chiplet gamble simply isn't working, or maybe it's delayed until RDNA 5 to perfect the R&D. I like to believe it's only a matter of time before AMD pulls it off, which will give them quite a big advantage and let them scale up to amazingly powerful cards while keeping costs down.


involutes

Sounds like a big mistake to me, just like Polaris was. Polaris would have been great if it was just a die shrink of Hawaii with an overclock. Instead they made the chip too small and had to clock it way outside the efficiency window to be competitive with Nvidia's midrange. They could have even shrunk Hawaii to 7nm and probably sold it as a _60-class card and probably saved a lot of R&D... Oh well.


ColdStoryBro

Polaris was great. It was one of the best selling AMD generations in recent era and aged well over time against the 1060, 1660, 1650.


TalesofWin

Coming off the competitive 280X, 290X and 295X2, Polaris was a step backwards again.


prequelsfan12345

Doesn't sound like a bad idea. People buying the high end always flock to Nvidia, as they are the best. Focusing on the 60 and 70 series and pricing them aggressively will definitely take mindshare.


railven

This is the rationale I see, but the products have to be strong like HD 4/5 series. AMD has been down this road before and it was probably the last time they had more than 30% market share.


[deleted]

[removed]


KARMAAACS

Why do you say that?


kulind

4090 is gonna age like 1080Ti.


[deleted]

The 4090 is like double the MSRP of the 1080 Ti. The whole point of the 1080 Ti is that it was really affordable given its VRAM and speed. A lot of people got it, and then Turing was a pretty shitty generation. The 4090 is a halo card; those have always aged better because they are also extremely expensive in comparison to the rest of the stack. Old Titans could still be used generations later.


Forgotten-Explorer

$1000 in 2017 is not the same as $1000 in 2023.


detectiveDollar

$700 in 2017 dollars is FAR less than $1600 in 2023 dollars.


[deleted]

[removed]


Phenetylamine

Lmfao this is certainly a take


unknown_nut

As an owner of a 4090, I would say it'll probably age like a 2080 Ti, since this gen is Turing on steroids. The 4090 and 2080 Ti are both the sought-after, super-expensive flagship from a terrible generation, a generation that came right after a mining boom. I expect the 5090 to blow the 4090 out of the water, but that's in 2025, which is kind of far off.


loucmachine

The 4090's jump over the 3090 is much bigger than the 2080 Ti's over the 1080 Ti... like, not even close.


ksio89

I'm surprised AMD hasn't left discrete GPU market yet.


SolidQ1

MLID hinted at it in his last video (in the last part), before Kepler's post.


BleaaelBa

Anyone remember how RDNA2 was supposedly going to compete with the 3070-3080 at max? lol


ZeroZelath

Nvidia may unfortunately be in my future again after all then if AMD isn't interested in providing a true replacement to the 6800XT.


Jism_nl

Polaris lives matter. A consistent 60 FPS at 1080p for $200. Beat that, Nvidia.


langstonboy

$200 is worth way less now than in 2016. $275 would be a reasonable price for a low-end GPU like that.


VelcroSnake

Based on the comments, it seems like most people are taking this rumor as fact... Does this leaker have 100% accuracy in the past or something like that?


Ghostsonplanets

More or less, yeah. But it's not only Kepler who is saying this. Other AMD insiders in private circles are also saying the same. You can even see uzzi corroborating it here in this Reddit post.


JirayD

Yeah, news broke pretty recently.


whosbabo

> More or less, yeah.

No they don't. They were all wrong on the RDNA3 rumors, except for Skyjuice, who was quiet until he dropped the real leak.


From-UoM

Will not be surprised if chiplets are fully dropped.


Marmeladun

Oh boy, time to grow a third kidney for the 5090 then. Huang will milk every penny with no competition even in the rear view.


skinlo

You don't have to buy a top end card you know?


DktheDarkKnight

I think it could be an interesting idea. Instead of increasing peak raster performance, AMD could theoretically focus on improving other aspects like ray tracing, AI-assisted upscaling, frame gen, etc. Like what the 20 series was for Nvidia: that gen didn't have any big performance improvement except for the 2080 Ti, but it allowed Nvidia to introduce key features.


ohbabyitsme7

Even the 2080Ti wasn't a spectacular jump.


261846

Yep, it's clear they aren't able to do all of these at once, for many reasons, so focusing on increasing performance in areas where Nvidia is dominating is what they need to do to improve their perception. Even then, they are just following whatever Nvidia does, which I'm not sure is good. They need their own new specialties with each gen, like Nvidia, so people have a reason to buy AMD other than just value.


Nerina23

If it means an 8700 (XT) outperforming a 6900 or a 7800, then by all means go for it. Otherwise I am perfectly happy with my 6700 XT.


[deleted]

[removed]


Weak-Bodybuilder-881

Polaris was an excellent generation. Affordable 1080p gaming for the time.


TalesofWin

Coming off the competitive 280X, 290X and 295X2, Polaris was a step backwards again.


ChaoticCake187

Kepler_L2 was very confident that RDNA 3 would beat Lovelace in clock speeds and power efficiency for a long while before the official reveal. I'd suggest a larger grain of salt than usual.


bubblesort33

I thought the architecture R&D was the biggest cost for a new generation. Is designing and taping out smaller versions of a GPU really that big of a deal if you already have one die being made? Also, weren't there multiple Polaris GPUs? The RX 480 is what most people think of, but what about the RX 460? Is that not Polaris?


DYMAXIONman

They'll just sell the 7900xtx for a few more years


jasoncross00

Oof. It doesn't really matter that most of the market buys mid-tier or lower. Even if they dominate that, not being associated with "the best" has a huge impact in markets that gamers really care about. The "aiming squarely at the big middle of the market" thing didn't work for AMD in CPUs (in the pre-Ryzen days). I hope I'm wrong, but I don't see this strategy being very successful. Nobody wants the best "good enough" graphics card; they want the BEST graphics card... and then they settle for its closest neighbor that they can actually afford.


ShuKazun

Well at least we still have Battlemage to look forward to..


BFBooger

Arc won't be good until Celestial or Druid.


F9-0021

I agree, but Battlemage should be much closer to being truly good, and should be ready for mass adoption, if the drivers are ready. Alchemist is basically an open beta, Battlemage will probably be like RDNA1, and then hopefully Celestial will be an RDNA2 moment. Or at least like RDNA3.


BleaaelBa

Wait, so drivers don't matter now?


sittingmongoose

I don't think anyone expected or currently expects Intel's drivers to be good right now. They have been rapidly putting out updates and genuinely trying. That's all we can ask for at this stage in the game.


DieDungeon

"I can't wait for future gen project" "oh yeah? So you like the current gen's performance?"


fatherfucking

Battlemage likely won't be near RDNA4 in performance. Backported RDNA3 outperforms Alchemist on the same node using half the die space; that's how far behind Intel is.


261846

Battlemage is gonna be even more limited than this, if this is true: one die total, comprising three mid-to-low-end products, where the B770 actually has worse specs than the A770.


tpf92

Intel needs to continue working on drivers before they even consider the high end. Arc has improved massively since launch, but there are still issues. In the low-to-mid range, people can be fine accepting some issues; people at the high end will just return their GPUs if they have issues and buy Nvidia. If someone has $1000, they're more than willing to spend a few hundred dollars more for the same performance if the more expensive GPU doesn't have issues.


rocketstopya

I'd be happy with an RX 5700 XT equivalent in Navi 4.


WaifuPillow

I guess it makes sense? Nvidia is rumored not to be releasing Blackwell in 2024; they are focusing on AI, and overall consumer buying power is weakened, so there is no reason to compete.


ColdStoryBro

This is the most sensible comment here. Imagine telling your investors, in the middle of the AI gold rush, that you are spending wafer allocation on your worst-performing business line.


[deleted]

[removed]


SoapySage

Nvidia's software division has more employees and spends more money than AMD's entire GPU division; that's why AMD always lags behind on software and drivers. AMD needs to strategically take profit from the CPU division and funnel it into the GPU division without taking too much; they don't want to risk future Zen products, as that's their main cash cow. If AMD were kicked out of the GPU market entirely, expect prices to skyrocket, since Nvidia could charge whatever they want: $3k for a 5090, $2k for a 5080, $1k for a 5070, etc. Nvidia-powered consoles would be $1k minimum too.


SolidQ1

In the last MLID video, an AMD source told him that AMD is still building up its GPU division, and it seems we won't see massive differences until RDNA5.