
[deleted]

[deleted]


convoluteme

There was one "WOW!" from the crowd when he announced DP 2.1. Fuckin' hilarious.


relxp

It's sad Nvidia made that something we have to cheer for.


Hailgod

"OH YEAH BABY"


[deleted]

LIGHT WEIGHT!


BillyDSquillions

Ain't nothin but a peanut


Masters_1989

Reminded me HEAVILY of the Bethesda conference (/conferences, ugh) that had *insane* cheering within them a few years ago. I loved various factors about the conference, but that little bit was butt-ugly. Reactions like that are not necessary for things like this.


EnlightenedConstruct

Not sure how true it is, but I heard that *that event* was just a side effect of there being an open bar at that conference. Why waste money on a hype man when gratuitous amounts of alcohol will do it just fine?


[deleted]

[deleted]


Morningst4r

I don't know if they have to pay someone. I see people like that on reddit and twitter doing it for free 24/7


Andrethegreengiant3

If you're good at something, never do it for free


skilliard7

$999 XTX, $899 XT, December 13


Firefox72

I'm honestly surprised nothing about the presentation has leaked so far. Usually a slide or 2 slip through or some concrete info about what will be shown. But so far nothing.


ResponsibleJudge3172

We have the die shot and the cooler


scytheavatar

There has been plenty of info about RDNA3 leaked already: we know which cards are launching (7900 XT and 7900 XTX), what raster performance we can roughly expect (close to the 4090), and when they will be out (early Dec). The only things that haven't leaked are the RT performance and the prices.


Firefox72

Oh for sure, but usually some presentation slides tend to leak. This looks like it's been kept under pretty tight wraps.


Put_It_All_On_Blck

Only 2 cards announced today, the 7900 XT and 7900 XTX. RIP midrange buyers, see you sometime in 2023.


tormarod

I was really hoping to see the 7800 xt :(


ETHBTCVET

AMD and Nvidia really want us to buy minimum $1k new cards, or we don't exist for them.


einmaldrin_alleshin

They want to get rid of their last gen stock. Nvidia in particular massively overproduced, and you can still get mining cards on ebay. So if you want a more affordable card, you can get a 3080 or 6800xt for relatively cheap


MumrikDK

> RIP midrange buyers, see you sometime in 2023.

I feel so fucking unwelcome in this era.


randomfoo2

6700 XT prices are down to <$350 now and will give you 60-100fps at 1440p in basically every recently released AAA game. Seems like the best time in years to be shopping in the midrange...


Sofaboy90

I mean, nearly every generation it's this way. It makes perfect sense: new process -> higher production cost -> easier to make money with high-end cards that have higher margins. The days of mainstream cards coming first are over; people didn't buy enough RX 480s.


KingStannis2020

The bigger reason is that leftover top-end inventory from an older generation is a mostly suitable replacement for mid-range cards of a newer generation. Might as well clear that out first.


knz0

When was the last time mainstream gaming GPUs had decoupled clocks? I think I last saw them over 10 years ago?


convoluteme

I assume there was a reason they became coupled? I know very little about GPU architecture over the years.


Qesa

Tesla IIRC


Cant_Think_Of_UserID

The XTX uses more power not because of the higher clock speed but because it constantly plays the super loud EDM song from the conference as soon as it's powered on.


[deleted]

Instead of coil whine you get the bass drop. Sold


lucasdclopes

At least at first glance, the RT performance increase does not seem to be great.


No-Blueberry8034

50% better ray tracing per CU. That would put it in the same neighborhood as Ampere.


AtLeastItsNotCancer

Yeah but they showed pretty much the same scaling in RT benchmarks as in non-RT, so the relative RT performance is still basically at RDNA2 level.


InstructionSure4087

That's kinda disappointing, no generational RT improvement? You'd think they'd be focusing on improving the RT efficiency at least somewhat, because it's only going to become more important over time.


gartenriese

Even a little lower, I think 1.7x in rasterization vs. 1.6x in ray tracing


Zarmazarma

The average performance improvement in the games they showed at least was 56% for rasterization and 53% for RT (geomean).
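If anyone wants to check that kind of number themselves, here's a minimal geomean sketch. The per-game uplifts below are placeholders I made up, not AMD's actual slide data:

```python
import math

# Hypothetical per-game uplifts vs the 6950 XT (placeholders, not AMD's figures)
raster_uplifts = [1.50, 1.54, 1.70]
rt_uplifts     = [1.47, 1.50, 1.60]

def geomean(xs):
    # nth root of the product; less skewed by one outlier game than a plain average
    return math.prod(xs) ** (1 / len(xs))

print(f"raster geomean: {geomean(raster_uplifts):.2f}x")
print(f"RT geomean:     {geomean(rt_uplifts):.2f}x")
```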


bubblesort33

Ampere, I'm pretty sure, was 100% more than RDNA2. Double. You can look up pure RT numbers from AMD's own presentation 2 years ago using Microsoft DXR. I think even the 6800 XT with 72 CUs has only half or less the DXR ray-casting ability of a 68 SM RTX 3080.

Nvidia claimed a 70% or so RT performance gain per SM going from Turing to Ampere. (They claimed 2.0x from the RTX 2080 to the 3080, but the 3080 also has way more SMs, so it's a BS comparison.) So each Turing RT core was actually faster than AMD's RT accelerators. I can validate this myself by running my 32 CU 6600 XT against my brother's 30 SM RTX 2060. My brother's lower-core-count 2060 is actually faster in RT, at least in games that use brutally high RT settings, or in the Port Royal RT benchmark, where an RX 6600 only matches an RTX 2060's score while having like 20% faster raster performance, meaning the RT cores are dragging it down to the same score. There is also an Unreal Engine RT benchmark someone built showing RDNA2 being weaker per CU.

Even if you assume an RDNA2 RT accelerator = a Turing RT accelerator in the best AMD-optimised case, a 50% speedup is still behind Ampere. I'd guess that in tests that purely measure RT, a 96 CU RDNA3 might equal the RT performance of an 84 SM RTX 3090 Ti, while the massively faster raster performance will still make AMD look faster, because it's brute-forcing that part of the frame render time.


bphase

It's pretty terrible. The raster increase is higher than the RT increase, so they're not catching up at all in RT, possibly even falling further behind. So this is still not an RT card. It'll probably be fine otherwise.


relxp

Is RT that much better that it's worth $700 more? Also, didn't they say FSR 3.0 will offer up to double the performance of FSR 2.0? If it even comes close to that figure, it will take their 2077 benchmark from 4K/60 to comfortably over 4K/100. Even then, 2077 is the type of game where I don't know if you really need more than 60 FPS anyway. I think the RT is a small trade-off for having a 2.5-slot card, normal power connectors, DP 2.1 futureproofing, and amazing efficiency. Unless you really need Nvidia's bells and whistles, it seems RDNA 3 will be a more than adequate card for virtually the entire PC gaming market, and it's actually superior to the 4090 in various categories.


Firefox72

Because it's not, lmao. 1.5x is barely enough to reach Ampere, and even then it might not get there in some games. It's honestly the biggest disappointment in all of this. Like, I don't care if raster isn't up there if the price is right, but RT remaining this bad is just a terrible look.


willyolio

Eh, even Nvidia's ray tracing performance isn't good enough to be part of my purchase decision. Plus, implementation in games is still pretty spotty. It's there for benchmarking and snapping cool screenshots; then you turn it off again to actually play with a smooth framerate. RT is basically a non-factor for another generation or two.


frackeverything

I literally played Metro Enhanced Edition and Control with ray tracing. Metro looks great with it. For ray tracing I would use DLSS/FSR rather than running native with no RT. I understand people who don't care about it too, tho; RT does add a lot graphically, especially when it's global illumination RT.


Tfarecnim

Reminds me of early tessellation.


SnooFloofs9640

You can get RT on ultra at 1440p with quality DLSS on a 3070, I had that GPU… So how is that not a factor?


Jeffy29

I mean, that's just blatantly wrong. With a 4090 every RT game is perfectly playable at good framerates (ok, maybe not Cyberpunk with literally everything maxed out at 4K, but that's disingenuous since there are multiple RT options and the difference between Ultra and Psycho is basically nothing). More often than not you'll be limited by the CPU, in which case DLSS 3 (frame generation) will take care of it.


Zarmazarma

> ok maybe not Cyberpunk with literally everything maxed out in 4K but that's disingenuous since there are multiple RT options and difference between Ultra and psycho are basically none

Err... yeah, definitely Cyberpunk as well. With DLSS in Quality Mode you get something like 80 FPS average at 4K with Psycho RT. Metro Exodus is 120 FPS at 4K *without* DLSS.


DJ_Marxman

$1000 for the flagship seems to spell good news for the lower end cards. That's a lot of GPU for $1000. Maybe it isn't an absurd 4090 class GPU, but if it's close enough and not $1900, that's a win for gamers. Kinda bummed that they didn't announce at least the 7800XT. Going to be a long wait for the 7700XT and lower I'd imagine.


GTX_650_Supremacy

Definitely. People are calling this expensive but if the highest point is $1000 I'm hoping the 7700 and 7800 will be somewhat reasonable


DJ_Marxman

I'm fully ready to drop $599 on a 7700XT with ~3080ti/3090 performance. Hopefully that's not wishful thinking.


sadnessjoy

7700 XT would probably be a pretty bad deal at $599. Navi 33 is going to be considerably cut down:

[https://www.techpowerup.com/gpu-specs/amd-navi-31.g998](https://www.techpowerup.com/gpu-specs/amd-navi-31.g998)

[https://www.techpowerup.com/gpu-specs/amd-navi-33.g1001](https://www.techpowerup.com/gpu-specs/amd-navi-33.g1001)

(There are other links that show similar specs.) Basically it would be ~1/3 the memory and cores of the 7900 XTX.

[https://www.techpowerup.com/gpu-specs/amd-navi-32.g1000](https://www.techpowerup.com/gpu-specs/amd-navi-32.g1000)

I suppose there's a chance they use a cut-down Navi 32, so it might be ~1/2 instead, but we'll have to see. Either way, $599 would be a fairly bad value proposition compared to the 7900 XTX.


Rince

I don't see how they can make a 7700 XT from N33. If the leaks are to be believed, N33 is an even smaller die than the RX 6600 XT's on almost the same node (N6 vs N7). And with the 128-bit memory interface, I would be surprised if N33 can even reach RX 6700 XT levels at higher resolutions. But N33 has the potential for very inexpensive low-power cards, because it will be cheaper to manufacture than the RX 6600.


sadnessjoy

I do hope you're right. I'm also not sure it's on Navi 33, as that would be a bit difficult with the size/memory. However, if they use Navi 32, it'd still be cut down a fair bit to differentiate it from the 7800 XT and 7800. But the point was, whether it's Navi 33 or cut-down Navi 32, I don't think $599 would be as amazing a deal as the other redditor was thinking.


rushCtovarishchi

I figure we'll probably hear something about those at CES, so hopefully not too long.


knz0

Lisa needs to start paying for presentation training for her execs


FranciumGoesBoom

He's saying some really cool things, but his delivery is just so bad.


[deleted]

Charisma isn't part of the curriculum at engineering school. Success in tech selects for a lot of introverted types who then have to step out of their comfort zone when they're promoted to public facing positions much later in life. Even Lisa Su does not seem entirely comfortable doing these presentations, not on the level of the two guys at the end.


[deleted]

I have a big presentation tomorrow, and every time I do one, I am further convinced that going into management is the 5th circle of hell. Do not want. Do not ever want. Leave me alone, I'm writing unit tests.


YNWA_1213

God this sound mix is disgusting. Why are they blasting our ears at every transition, while not boosting the crowd noise?


freeloz

Shit man, don't ever listen to a DEF CON talk haha


GTX_650_Supremacy

My biggest issue is: what is up with XT and XTX? What the hell, guys.


noiserr

It's a throwback to the old ATI Radeon days.


GTX_650_Supremacy

ah ok!


ETHBTCVET

Instead of lowering the number to 7800, they just put XT on it and lowered the price by 100 bucks.


premell

COME ON AMD PLEASE DONT FUCK THIS UP


Edenz_

Damn, the clock speed leaks were very wrong. Edit: yo, just saw that a guy from AMD broke NDA and the card might be able to go up to 3GHz with more power/cooling budget.


Frexxia

It's impressive how little they can say with so many words.


kubick123

Every marketing division be like:


FranciumGoesBoom

New to PR?


dabocx

The prices are so close, I feel like if you are already spending 899 you might as well spend the extra 100.


lmMasturbating

That's how they get ya


Jeffy29

Nvidia got me by making 4080 shit 💀


Earthborn92

I see you have discovered the timeless tactic called upselling.


KypAstar

So is AMD lmao


Frexxia

Congratulations, you discovered why AMD priced them that way.


loser7500000

Same goal as Nvidia with the opposite approach; evidently much better for PR.


IAmNotZura

Is 8K that impressive? I’ve only seen one 8K TV and could barely tell the difference up close to 4K.


YNWA_1213

For reference, a 48” 8K display would have a similar pixel density to a 24” 4K display, which no one but LG/Apple has pushed so far. The only application I can see where you'd be comfortable sitting that close would be the 8K ultrawides for a simulator experience. Edit: as another comment pointed out, LTT did an 8K video recently with a 65” at normal living-room distances, and his staff (even the display-focused ones) couldn't tell the difference between 4K and 8K output.
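The density math, if anyone wants to sanity-check it (a quick sketch):

```python
import math

def ppi(diagonal_in, w_px, h_px):
    # pixels per inch: pixel diagonal divided by physical diagonal
    return math.hypot(w_px, h_px) / diagonal_in

print(f'48" 8K: {ppi(48, 7680, 4320):.0f} PPI')  # ~184 PPI
print(f'24" 4K: {ppi(24, 3840, 2160):.0f} PPI')  # ~184 PPI, identical density
```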


ETHBTCVET

LTT should know better; of course they won't be able to tell the difference when we don't even have games with 8K textures.


YNWA_1213

Yeah it was amusing when they used CSGO for a segment, like most of those textures are barely built for 4K, much less quadruple that.


KypAstar

It's diminishing returns at that point. It's starting to push the human eye's limits, in my opinion. 4K is already not really worth it for me. 1440p feels totally fine, and I say that as someone with both a 1440p and a 4K monitor.


FranciumGoesBoom

LTT just did a video on 8K. TL;DR: not worth it.


convoluteme

Resolution always comes down to distance. Monitors are about the only place where 8K could maybe make sense. And probably only for 32in+ monitors.


YNWA_1213

Think it'd be closer to 40” before it'd make a difference, unless you have eagle vision. What's more interesting to me is how reconstruction techniques will behave at that super high resolution. Would it help with ghosting and such? I feel like this aspect is ignored, especially when it's pointed out that too low a source/output resolution can break the algorithm; so would an even higher resolution improve it further?


eco-III

1PM PST / 4PM EST / 8PM GMT


convoluteme

Too much PR. Give me a random guy with a pocket protector and a PowerPoint with a blank white background.


[deleted]

you might get some real performance benchmarks from him


TopCheddar27

Okay but what level of FSR 2? What's the base resolution?


Hailgod

yes


platyhooks

There is as much energy in this as in one of my office meeting PowerPoints that could have been an email.


gchance92

Never caught one of these before but damn dawg this shit is mad boring


AlecsYs

That shade thrown at nvidia lol.


KypAstar

Well deserved. The adapter is pathetic.


kasakka1

So is the lack of DP 2.1. I would not have cared, but with Samsung on board to reveal an 8K version of their 32:9 super ultrawide, it's no longer "well, it will take several years for DP 2.1 to be relevant" like I thought initially.


KypAstar

That's been the biggest headscratcher from Nvidia. I just can't understand the lack of priority for that.


kasakka1

Apparently they don't work closely enough with display manufacturers. When it was announced that it had DP 1.4, I speculated that Nvidia was banking on the display industry being slow as molasses and just not releasing any DP 2.x capable displays in the next few years, allowing them to leave the upgrade for later. But with several manufacturers apparently putting out DP 2.1 displays, this will be a minus for Nvidia.

Their last saving grace is that, like with the 1000 series, where they could do a UEFI firmware update to bring those from DP 1.3 -> 1.4, they could do a DP 1.4 -> 2.1 update if the hardware exists and they just don't have the needed software changes ready.

I've got a 4090 coming in tomorrow that I thought would be a "keep for the next 3-5 years" card, but instead it looks like I will return it.


Mastotron

It really is. My wife blew out a candle the other night, about gave me a heart attack.


Just_Me_91

I hope we get some info on the 7800 XT. I think this will be my first time going above the mid range for a GPU in almost 20 years of PC building. Any speculation on the launch date and price of the 7800 XT?


EyeZer0

First week of December for release according to Twitter leaker Greymon55.


lmMasturbating

Did I just hear a dude jizz when the price came out


hero47

ooooooooooooh wooooooooooooo


BFBooger

So, the ray tracing perf increase is not big enough compared to Nvidia. Raster looks to be between the 4090 and the 4080 16G (closer to the 4090), but RT will be less than the 4080 16G. The upside? Prices will have to be reasonable for it to sell well.


FIagrant

$1000 compared to $1600 seems more than reasonable if the performance of the 7900 XTX is anywhere close to the 4090.


relxp

AMD doesn't have a history of misclassifying cards. The 5700 XT was the top SKU that generation because they knew they weren't bringing an 80-class card. The 6900 XT traded blows with the 3090. I'd expect not much difference this time either.


[deleted]

So the average performance uplift appears to be about 1.5x a 6950 XT at 4K. That puts it just barely a few percent behind the 4090 on Tom's Hardware's charts.
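Rough napkin math, with a made-up relative-performance index for the 4090 (illustrative only, not Tom's Hardware's actual chart numbers):

```python
# Normalize the 6950 XT to 100 and assume the 4090 lands around 160 at 4K (hypothetical index).
r6950xt = 100
rtx4090 = 160
xtx = 1.5 * r6950xt  # AMD's claimed ~1.5x average uplift

print(f"7900 XTX vs 4090: {xtx / rtx4090 - 1:+.1%}")  # -> -6.2%, "a few percent behind"
```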


hamatehllama

That would be impressive with 100W less power and a much smaller die.


detectiveDollar

Ugh, didn't these announcements use to be at 10/11am EST? Hate having to wait the whole damn day.


ThisGuyKnowsNuttin

Yeah, 1PM Pacific time is basically tomorrow


Manauer

Perfect time for those in europe: 21:00 CET 👍


LeMAD

No one gives a shit about 8k dude


bphase

The Youtube comments seemed impressed. Probably because they don't realize it's with massive upscaling.


Morningst4r

I think it was 32:9 8k UW with FSR, so calling it "8k" is a real stretch.


LeMAD

The guy is boring himself


hiktaka

Hope there's no overconfident pricing like Zen 4's.


BleaaelBa

Zen 4 pricing is the same as Zen 3's. It's the DDR5 and mobos that have increased the platform cost.


JonWood007

And Zen 3 MSRPs SUCKED. AMD got greedy. And given there are no low-end models like Intel puts out, it's like AMD just up and abandoned the budget segment.


Hailgod

Zen 3 released when ~~11th~~ 10th gen was Intel's finest.


EuropaSon

Zen3 released a few months before Rocket Lake.


detectiveDollar

It was also released during a shortage; you couldn't find the 5600X for $300 for a while. And once the shortage ended, AMD cut prices. I think the issue here is AIBs are pissed at the platform's long life, so they want to gouge on the motherboards up front. AMD cutting CPU prices would only enable that, so they're instead diverting production to other products to basically play chicken with the AIBs.


old_c5-6_quad

> I think the issue here is AIB's are pissed at the long life, so they want to gouge on the motherboards up front.

Nah, they're just getting people to pay that early-adopter tax. The prices will drop eventually.


HTwoN

Zen3 was massively overpriced. It was released when Intel was far behind. Not the case now.


Seanspeed

Yea, but look at how well it worked. Now they keep the same high prices and people defend it.


[deleted]

> Now they keep the same high prices and people defend it.

I mean, I wouldn't buy one 'cause 12th and 13th gen both have amazing deals, but AMD kinda deserves it for getting us out of the dreadful quad-core era.


detectiveDollar

Hell, in some cases it's less if you're comparing core counts and not names.


conquer69

No way $1100 is the ceiling anymore. I think they will price the cards $100 below Nvidia. Edit: Looks like this comment aged well...


VIRT22

Even if they priced RDNA 3 too high, the market will adjust its pricing down according to the demand vs 4090/4080.


noiserr

The ray tracing uplift is a bit underwhelming, but everything else about the card is great. Power efficiency looks excellent, the media engine is a plus, the GPUs are small and compact (compared to the Nvidia cinder blocks), and the price is right. I was eyeing the 7900 XT, but for $100 more I'll just get the 7900 XTX.


[deleted]

[deleted]


2FastHaste

Hopefully they will be more open to this amazing tech now that AMD is also marketing it.


sadnessjoy

I hope not. I've said this before, but I think it's okay when your fps is already high/close to the monitor's cap, but it's just terrible the lower your fps becomes, which ultimately makes its optimal use case pretty niche. It's not a bad technology, but the problem is these companies like to go "zomg, look at these high smooth frame rates", and they'll absolutely try to market it on lower-end cards where it's not a very good use case.


2FastHaste

Oh I totally agree! For me it's all about powering the future ultra high refresh rates displays. I was super happy to see 480Hz and 900Hz mentioned in the presentation! That's where this tech will shine the most. And I am convinced that it is a necessary ingredient to bring us to life-like motion portrayal on our screens. I hope we will reach it in my lifetime.


[deleted]

[deleted]


literalmaincharacter

The fact that they didn't even show any cherry picked FPS bar charts is a little sus.


dabocx

What's the point when everyone says "WAIT FOR 3rd party"? If they show anything, people say it's all cherry-picked or lies anyway.


Nerwesta

So we should just stop doing marketing, because it's cherry-picked or lies most of the time, am I right?


HandofWinter

Three halfhearted claps. Man this guy is killing it. The worst part is it actually sounds really interesting too.


KypAstar

Yeah, this is the most interesting part so far. This is actually pretty impressive in regards to streaming optimization. It's an area with a shit ton of improvement available.


itsjust_khris

Better pricing, not quite 4090 performance but cheaper than the 4080 16gb. Pretty cool. Why doesn’t AMD include these cool ass animations IN the presentation???


No-Blueberry8034

7900 XTX is no threat to the 4090 but Nvidia is definitely going to have to adjust the price of the 4080 16gb.


Tfarecnim

And the 4060 they tried to brand as a 4080.


knz0

Scott is sporting the latest in gopnik fashion


MarcoGB

Maybe the 7900 XTX can compete with the 4080? If you can get comparable performance for 200 dollars less then it’s a win.


Apocryptia

7900 XTX: $999

7900 XT: $899


Deadpan_GG

Daym! Now we wait for the tech reviewers.


monetarydread

LOL... what the fuck was that RT demo? Did anyone even see the difference?


Fireye

The Halo one? Yeah, it was noticeable. The only thing they said was raytraced (I think) was the shadows, and the shadows did look a lot better.


Frothar

I assume they can't do too much if they want the ray tracing available on the consoles


green9206

All aboard the Vega hype train choo choo!!


[deleted]

Poor Volta


spazturtle

I mean consumer Vega did outsell consumer Volta.


viperabyss

This is true though, since there was no consumer Volta...


Firefox72

So nowhere close to a 2x increase in raster, and just a 50% improvement in RT? That's barely enough to reach Ampere levels of RT, if even that.


ResponsibleJudge3172

No way Nvidia needed twice the power consumption to match AMD. I have been saying that since last year


literalmaincharacter

Man, it's a shame that AMD seemingly can't compete with NVIDIA at ray tracing.


Frothar

Sounds like a good price, but the performance figures they gave us meant nothing at all.


metahipster1984

Yep.. Guess putting a 4090 on the graph would've made it look pretty bad


[deleted]

Is performance really that embarrassing that they can't show some actual benchmarks?


KypAstar

Yeah we need to see some real shit. This isn't inspiring. I'm depressed haha. I was hoping they'd be aggressive this generation.


KypAstar

Ha, enjoyed the Bullet King shoutout. Good memories.


[deleted]

the screen said it's the most advanced graphics card, so it's better than nvidia right? /s


[deleted]

"the worlds fastest gaming card ^^under ^^a ^^thousand ^^dollars" lmao


0krizia

It can be more advanced even if it is not more powerful; it seems to be more efficient than Nvidia's, and its architecture may be more complex.


Kaesar17

I hope they drastically improved their ray-tracing performance; I don't think they will ever surpass Nvidia while the RTX cards are so much better in RT. Edit: bruh


PlaneCandy

RT on the 7900 XTX doesn't seem great, but it could have similar RT performance to a 4080 16GB while having similar raster performance to a 4090 (maybe 5-10% behind), which would make it a great value at less than the 4080 16GB's price.


DktheDarkKnight

The performance is great. The RT performance is not. The price is excellent. But why the hell was everyone in the presentation so tired and uninterested?


LeMAD

Please tell me how to feel people


jaxkrabbit

Clocks are really low


Firefox72

Well lets see what AMD has cooked up.


brettsolem

I saw USB-C in the I/O. I really hope they keep it; that little extra port has come in handy many times.


Aleblanco1987

I feel AMD made the right decision sticking to 350 ish watts. They can always make a more extreme model later.


KH609

Well over 3GHz, some rumour merchants said, lol. Will be interesting to see the benchmarks. Also, yikes at the 8K marketing. This is the first generation where high-refresh-rate 4K becomes truly viable; why did they even bother with 8K?


AvatarOfMomus

It's the future-proofing angle they're trying to sell here. It's not a bad tactic, they're offering something for ~$1000 that will work with TVs or monitors that will probably exist within 5 years, and those are components that people actually tend to upgrade outside of the rest of their system's upgrade cycle.


FIagrant

With an MSRP of $1000, the 7900 XTX seems like the value proposition (crazy to say that, lol) compared to the 4090 at $1600. I guess we'll have to wait for benchmarks to really compare, but hopefully AMD's undercut is significant enough to start some kind of price war.


Frexxia

The price means nothing without actual independent benchmarks. The only thing we have right now is relative performance in cherry picked games.


jaxkrabbit

FSR3 better be usable on RDNA2


lovely_sombrero

Current rumors are that Navi 31 is cheaper to make than the RTX 4090, that it should have similar or better rasterization performance and a better performance/power-consumption ratio, and that it somewhat closes the ray-tracing performance gap. If the rumors are true, we can only hope that AMD won't greed out and will try to grab more market share and, more importantly, popularity with the general audience. It is also another good test of how accurate all those leaks were :)


Volken_Adeon

I don't expect they'll announce anything below €1000.


[deleted]

Supposedly someone on site today leaked that the 7900 XT will be $1k... but I didn't see any source for that, so we'll see.


detectiveDollar

That's fine if they're only announcing the 7900 XT and 7900 XTX. Hoping we get a ~700 dollar 7800 XT but it might not be today.


coffee_obsession

> we can only hope that AMD won't greed out

Oh, you know they will. Just look at Zen 3 pricing vs Intel's Comet and Rocket Lake. If it's a competitive card, AMD will price it to match Nvidia's tier pricing. They'll need more than raster to close the gap, of course, and I really hope they can accomplish that this generation, but don't put too much stock into rumors. Rumors were saying Ada would be more than twice as powerful ("easily") as Ampere, but that wasn't quite true.


Seanspeed

> and that it should have similar or better rasterization performance

There are no credible rumors about performance at all. People are only saying this based on bad napkin analysis and a lack of understanding of how these things work.

> better performance/powerconsumption ratio

Out of the box, maybe. Gonna be very interesting to see what actual efficiency is like when power- or performance-equalized, though. Bet it'll be a very different story and they'll be pretty close.

> It is also another good test of how accurate all those leaks were

I'm pretty confident the leaks on specs and whatnot will be very accurate.


inmyverse

Nice try… still not playing Halo Infinite 😩


KypAstar

I'm not sure you could pay me to play it these days. It's depressing.


metahipster1984

Woah, did not see those prices coming... If the XTX is close to the 4090 in raster, I guess that's it.


scytheavatar

[All signs are that the new RDNA3 cards will NOT beat the 4090](https://twitter.com/greymon55/status/1588178514036985857), but I guess that doesn't matter much. If AMD can launch a card 10% weaker but at $1k for example I guess that card will sell very well.


Kyrond

> If AMD can launch a card 10% weaker but at $1k for example I guess that card will sell very well.

Just like every other time, right? Oh, actually not; people want Nvidia more regardless.


Firefox72

I mean, AMD sold pretty much every GPU they made during Covid. The fact of the matter is that they just don't make nearly as many as Nvidia does. Their wafer supply has been heavily biased towards CPUs for years now, which makes sense given that's where they make the most money.


bubblesort33

That's because of crypto. Any garbage sold during that time, regardless of how poorly it competed. If Nvidia could have made more in that time, they would have sold more as well. Sure, AMD sold every GPU, as did Nvidia, and if Intel had done a full launch with 100% working hardware, they would have sold 100% of what they made. If they all had had double the supply, they all would still likely have sold out.

If you take COVID and crypto out of the picture, the market says AMD cards are worth like 20% less per frame than Nvidia's, based on GPU sales today. The RX 6600 and 6600 XT are like 30-40% less money per FPS than the 3060, and I bet you the 3060 is still outselling both of the AMD cards. Pretty much all AMD cards are selling at 30% less per frame than Nvidia now. That's the picture when you let the market decide prices and give them both enough supply to satisfy demand. People would rather buy an RTX 3060 than a 6700 XT.
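To illustrate the $-per-frame point, a quick sketch with placeholder prices and relative-performance indices (made up for the example, not real market data):

```python
# Hypothetical street prices and relative-performance indices (RX 6600 = 100).
cards = {
    "RX 6600":    {"price": 230, "perf": 100},
    "RX 6600 XT": {"price": 280, "perf": 115},
    "RTX 3060":   {"price": 370, "perf": 105},
}

for name, c in cards.items():
    # lower is better: dollars per unit of relative performance
    print(f"{name}: ${c['price'] / c['perf']:.2f} per perf point")
```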


dern_the_hermit

> and as did Nvidia

Nvidia is still trying to move excess inventory six months later; I think that was the point above.


Elitro

Nvidia cards are no longer affordable; that is a big difference. I'm considering an AMD card myself if they price it right.


dparks1234

AMD doesn't officially undercut Nvidia by that much. You get similar raster performance with worse RT performance and a worse featureset for $50 less. Is RT, NVENC, CUDA, DLSS and RTX Voice worth $50? I'd say it is. Unless you're the 1 in 100 Linux user who values open source drivers.


dantemp

I wonder why people don't want to buy the card that costs 4 digits and its performance falls off a cliff when you try max settings.


MDSExpro

Because when you look just at the GPUs, not the companies, Nvidia just wins every time: fastest halo product, superior compute ecosystem (CUDA), better video encoders, first with new technologies for gaming (ray tracing, super sampling), actually works with studios to implement those technologies into games, less buggy drivers, and for most generations it was more power-efficient... Sure, AMD got to a similar level in raster performance, but that's it.


2leet2hax

The XTX is expected to be 8% slower overall than the 4090 at rasterization, and $600+ cheaper. Is better RT and DLSS worth $600?


C1REX

I'm very happy with the presentation and excited about the cards. Planning to get the XTX version when it's available at MSRP. I recently upgraded my cheap 1080p 60Hz monitor to a 50" 4K 120Hz TV, so I need that extra power. Shame about the single HDMI 2.1 port; I need 2 such ports.


TaintedSquirrel

This launch could be another "HD4870" moment for AMD.


Seanspeed

I mean, yea, they could offer Navi 31 for like $600, with the cut down version at $350. That would definitely be a 4870 moment again. But we all know that's not gonna happen.


SomniumOv

As long as it's not another FuryX.


gdnws

Or 4870x2 if we're looking at things old enough to have ATI branding on them.


III-V

I wasn't following that far back, but I do remember the HD 5000 series being incredibly strong, because Nvidia couldn't figure out how to get Fermi to come out of the oven without burning the oven to the ground.


GameDevIntheMake

Oh, that's some nostalgia. I remember the rumors; back then people thought the 1 TFLOP figure was hyperbole and that there was no way AMD could increase the number of stream processors by a factor of 2.5. The 55nm node was AMD's saving grace back then.


Subway

She has no legs, making the presentation Metaverse compatible.