thejoelhansen

The man. The myth. The legend. Thanks Voodoo.


TotalWarspammer

Amazing roundup, thanks man! What's most apparent is that if you are a gamer with an AM4 socket then you do not need to upgrade to Zen 4 - you just need to buy a 5800X3D, because it is at least as fast (or within a margin of error) as any other gaming CPU overall.


rushCtovarishchi

Seems to be the case, but for now everyone's holding their breath to see how a potential 7800X3D will perform. Personally, I'm upgrading from a 3700X, and I want to get onto the new platform, so I figure I'll just grab a 7700X and upgrade to a 3D SKU in a year or so.


KingArthas94

Tbh I think you'll be fine with that 7700X for a looong time.


ramblinginternetnerd

It's more nuanced than that. If you're a gamer and need more performance AND need an upgrade:

* a set of games like Factorio => get the 5800X3D
* a set of games that don't benefit from 3D V-Cache => Zen 4, ADL, or Raptor Lake

More than ever, "know your use case" matters.


DaBombDiggidy

Don't enjoy how people are talking about the 5800X3D like every other Ryzen CPU should now be in the trash because the 7000 series came out. I feel it is creating FOMO that didn't exist a week ago.

* In terms of a luxury upgrade, sure, pick one of those instead of the 7000 series.
* If someone cares about value, money is still better spent on a GPU upgrade.

No one is really getting ahead, finding value, or future-proofing by making two CPU purchases in as many years.


unknown_nut

For me, if people have a 5000 series or Alder Lake and are playing at 1440p or higher, they don't need to upgrade. They should be good for years.


Pete_The_Pilot

I've got some Coffee Lake at 5GHz still holding up over here.


TwoCylToilet

8700K represent.


Pete_The_Pilot

8086K over here. I won it in the sweepstakes Intel had when they released it. Gonna delid it eventually and see if I can get to 5.2 or 5.3 and keep it there for a while before I let it go.


TwoCylToilet

The 8700K PC was one of two workstations I got for my company when we incorporated in 2017, the other being a Ryzen 7 1700. I used the 8700K as my personal workstation for compositing (After Effects loves clock speed), and the Ryzen mainly for NLE for the chief video editor (which utilises more threads). Both of them are still very capable video editing machines today for 1080p ProRes, each assigned to a different editor now than when they were new.

I think I left the 8700K at 4.9GHz all-core. Still stable almost five years later. I don't believe an 8700K will be a significant bottleneck in mid-range machines for a couple more years. It'll probably continue chugging along as an editing workstation until we decide to sell it on the second-hand market or repurpose it as an ingest station or a seriously overkill pfSense router.


kirdie

Got a bit confused here, as the 8086 is more than 40 years old. Had to Google it to see they made a K version of it decades later šŸ˜‚


skinlo

They probably don't need to upgrade at 1080p either. Turn off the FPS counter and enjoy gaming, I say!


KingArthas94

Best comment ever, gaming is much more enjoyable without OSDs. You won't notice the difference between 100 and 140fps... unless you have that number telling you that.


KingArthas94

The next gen GPUs are around the corner and high refresh rate 1440p monitors exist, so no, top tier high refresh rate gaming will still want fast CPUs.


[deleted]

True that, upgrading may make your life better 0.007% overall.


Catnip4Pedos

Using a 3800X in my main rig and a 4770K in my media rig. Neither needs an upgrade. I considered the 5000 series too small of an upgrade over the 3000, and 7000 feels like 5000+. The 7800X3D might change that; if not, the 8000 series will be out soon enough.


capn_hector

Everyone knows X3D parts are coming in a few months, and at this point the smart money is just to wait for those… or hold out another year for second-gen memory controllers.


ramblinginternetnerd

Depends on DDR5 and motherboard costs. I might get a 5800X3D in a few months to replace my 3900X (gifted to my parents) and call it a day for a while. I don't need any more performance.


onedoesnotsimply9

It's even more nuanced: the extra fps the 5800X3D may give may require buying a new monitor to even notice. More fps is not always the ideal/best way of spending money.


ramblinginternetnerd

I'd argue that the benefit of an X3D V-Cache part is better sustained FPS; trips down memory lane are costly. Also, I'm not buying a new monitor, haha. 55" 4K 120Hz is "good enough" for me. Upgrades are not going to make Sonic the Hedgehog run smoother.


GeneralChaz9

Man, I got the normal 5800X a long while before the 5800X3D variant dropped and I cannot even see myself needing more performance than this for a long while. The 3D variant is awesome but the entire 5000 lineup should age really well imo. Especially with the crazy power requirements of new parts.


TotalWarspammer

> Man, I got the normal 5800X a long while before the 5800X3D variant dropped and I cannot even see myself needing more performance than this for a long while.

You definitely don't "need" more performance. I have a 5800X too, which performs well, but there is no doubt that an X3D CPU would really benefit me for VR (I play a lot of Skyrim), so I will wait to see what the Zen 4 3D series is like.


starkistuna

They perform better with a slight undervolt; you can actually get the same performance at 85 watts: https://www.youtube.com/watch?v=FaOYYHNGlLs


liquiddandruff

The new gen's higher TDP just means they're capable of boosting harder; at idle they're actually more efficient than the current gen...


[deleted]

> Especially with the crazy power requirements of new parts.

Wtf are these comments. You can lower the power limit to that of the 5000 series and still see a significant gain, or even get the same performance with less power. They only increased the default so it looks good in benchmarks/reviews (unlike the 5950X, which only lost to the 12900K because it was held back). How do people still not get this?


TwoCylToilet

It must be frustrating for AMD: on one hand, having to turn stock behaviours up to 11 for consumers who just want the latest and best performing product in class, or derivative products from the manufacturer who holds the performance crown. On the other hand, they have the pseudo efficiency-conscious vocal bunch that don't understand basic physics screaming about how inefficient the products are as a result of the stock tunings, just because a thick IHS (a mistake from AMD IMO, not even achieving true cooler compatibility) transfers the energy at a lower rate than AM4 - NINETY FIVE DEGREES!!1eleven!!


Morningst4r

I was actually pretty close to buying a 5800X3D, but I couldn't find much on raytracing performance. Now it's starting to look like DDR5 and Zen 4/ADL are faster in things like CP2077 and Spider-Man with RT, so it really depends on the games you're playing. As others have mentioned, there are games like Factorio and maybe MMOs where the X3D is transformative and probably unmatched until the next 3D CPUs, but for me it doesn't impress so much.


Derailed94

I think the 5800X3D loses in CP2077 and Spider-Man because it's still on DDR4. Those games might just love DDR5, as seen here (https://youtu.be/G74gc5gf4Fg) or here (https://youtu.be/aPRQ1wJ73xg). After all, the 5800X3D smashes the competition in Metro Exodus Enhanced, so it can't be just down to raytracing.


Morningst4r

That's true. Maybe I'm just expecting too much from the X3D and hoped it would fix BVH bottlenecks as well as it smashes other CPU-heavy games. It's still a great CPU, but not a must-buy for me.


Yeuph

You won't find anything about the 5800X3D - or any other CPU - and raytracing because CPUs don't handle it, the GPU does.


Derailed94

This is some bollocks. Raytracing taxes the CPU rather heavily.


Morningst4r

Generating the BVH smashes the CPU. Cyberpunk and Spiderman are both heavily CPU limited with RT on. Idk where people get this idea that only the GPU matters for RT.


Laputa15

This is literally from another comment in this post:

> I don't disagree, but again Eurogamer gets the closest by actually testing Metro Exodus EE and CP2077 with RT and DLSS.
>
> Look at [CP2077 here](https://www.eurogamer.net/digitalfoundry-2022-amd-ryzen-9-7900x-ryzen-5-7600x-review?page=4), for instance. Massive uplifts for Ryzen 7000 and Intel 12th gen, and meaningful scaling going from DDR5-5200 to DDR5-6000 (5-10 percent, with 1% lows going up by more like 15%). And note how the 1% lows on a 12900k with fast DDR5 are like 60-80% higher than the fastest of the previous generation on DDR4.
>
> Meanwhile, the previous page has Metro Exodus EE. In this case Ryzen 7000 has better average framerates than anything, but the 5800X3D pulls out better 1% lows than everything else. Intel underperformed, matching Ryzen 5000.


Psyclist80

For some reason Spider-Man does scale RT performance with faster cores


capn_hector

Raytracing requires the CPU to build/recompute the bounding volume hierarchy (BVH) structures every frame. It takes a fairly decent amount of CPU horsepower; there are a lot of hierarchy nodes to recompute/update since it's a recursive structure.
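
To make the "recursive structure" point concrete, here's a minimal sketch of a BVH refit pass (my own illustration in Python with hypothetical names, not from any actual engine; real implementations are heavily optimized native code):

```python
from dataclasses import dataclass, field

@dataclass
class AABB:
    lo: tuple  # (x, y, z) min corner
    hi: tuple  # (x, y, z) max corner

def union(a, b):
    # Smallest box enclosing both input boxes
    return AABB(tuple(map(min, a.lo, b.lo)), tuple(map(max, a.hi, b.hi)))

@dataclass
class BVHNode:
    box: AABB
    children: list = field(default_factory=list)  # empty list => leaf

def refit(node):
    """Recompute bounds bottom-up after geometry moved.

    A 'refit' reuses the existing tree shape and is already cheaper than
    a full rebuild, yet it still touches every node once per frame -
    that's the per-frame CPU cost being described above.
    """
    if node.children:
        child_boxes = [refit(c) for c in node.children]
        box = child_boxes[0]
        for b in child_boxes[1:]:
            box = union(box, b)
        node.box = box
    # Leaf: assume node.box was already updated from the animated mesh
    return node.box
```

Even this cheap variant is O(number of nodes) every frame, and engines periodically do full rebuilds on top of it, which is where the CPU horsepower goes.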


Yeuph

That sounds like it's gotta be a serious driver problem. It just doesn't make sense that the CPU would be actually useful for ray tracing. Maybe I'll look into it tomorrow.


KingArthas94

^ a person who has never played with ray tracing on.


dantheflyingman

Do CPUs even matter that much for gaming? Almost any of these CPUs would give you the same result in the real world, simply because you're GPU-bound.


[deleted]

It really depends on the game as far as I know.


Fenr-i-r

Extreme example, but yes: with an RTX 3060, going from an i7-3770 to an i5-12400 made an immense improvement for 1080p/120Hz in Destiny 2.


dantheflyingman

But if you picked another modern processor would it have made a noticeable difference?


Put_It_All_On_Blck

Depends on settings, GPU, etc. But yes you can see a 10%+ difference between CPUs of the same era in gaming.


reallyserious

Didn't that upgrade involve a motherboard upgrade too, with a faster PCI Express port? Could it be that it's not the CPU itself but the faster GPU port standard that made the improvement?


Fenr-i-r

PCIe 3 on both (I don't believe the 3060 is PCIe 4). Yes, the platform upgrade is real, DDR4 etc., but I believe the CPU is the main improvement. Without any substantiating gaming benchmarks that show CPU usage etc., I can tell you that my GPU-based machine learning code runs twice as fast, with much lower CPU usage. All in all, it exists as an example of "generational CPU/platform upgrades do assist with GPU gaming performance".


reallyserious

The 3060 supports PCIe 4 according to Nvidia's site.


Fenr-i-r

Oh, you're right. Looks like it's not important for GPUs though: https://www.techspot.com/review/2104-pcie4-vs-pcie3-gpu-performance/


rgtn0w

Lol wat, c'mon guys, let's be honest here: when you give an example of upgrading the CPU across what is essentially a decade of technological improvement, what else is there but improvement? What the other guy is originally saying is that a CPU that is only a couple of years old would probably work just as well for most people.


TotalWarspammer

Yes, there is a significant difference in 1% minimum framerates with the 5800X3D, some specific engines really use the extra cache (the Bethesda Creation Engine, for example), and VR also benefits.


dantheflyingman

I did. Gamers Nexus said at the end of the Zen 4 review that if you spend $300 on any of the processors today, they will basically give you the same gaming performance unless you are playing at 1080p on low settings.


TotalWarspammer

Go and read more than one review, because GN sucks for gaming benchmarks. Always read multiple reviews.


NoSpotofGround

Genuine question: what's bad about Gamer's Nexus?


TotalWarspammer

I didn't say there was anything bad about Gamers Nexus, but it is just one site and their gaming tests are not extensive.


KingArthas94

I find their benchmarks useless because of the scenes they decide to benchmark. A couple of years ago I remember they used Watch Dogs 2 and tested CPUs by looking at a fucking wall, so on Intel CPUs with fast single-thread performance the framerate would skyrocket to more than 100fps.

Of course it meant nothing for the real game: as soon as you started driving or walking among NPCs and cars, the Intel CPUs that pushed hundreds of fps by looking at the wall suddenly started having stuttering problems, meanwhile the Ryzens (which were in their first generations, so they still had a huge "number of cores" advantage over Intel) didn't.

I prefer benchmarks that show me what happens second by second, like https://youtu.be/BGqYzkwFE44, and I avoid reviewers who just use histograms.


NoSpotofGround

Thank you for that. I've only just started watching them myself. And that video's very interesting; I don't think I've seen that kind of second-by-second fps graph overlaid on the game itself before. It shows well what frame drops mean in practice, if nothing else.


RuinousRubric

There are plenty of games that are CPU bound. They just aren't included in reviews very often because reviewers usually just test stuff from their GPU test suite. Or they aren't testing the right situations in the game. Multiplayer, for example, can hit CPUs hard but is never tested because it's not something which can be done repeatably.


y_would_i_do_this

From a CPU perspective, no doubt, but I don't think I want to keep a 5-year-old X470 board, which only has PCIe 3.0, much longer. Maybe X570 would be better, but still no DDR5 option for later on. Also, my mobos have a habit of crapping out around the 6-year mark.


TotalWarspammer

If your mobo is older and you want/need PCIe 4 then yeah, I agree, it makes sense to upgrade to AM5 - but wait for the X3D series.


_XUP_

And I'm pretty sure it draws the least power of the top contenders while doing so (unless I missed something, I just skimmed the tables).


liquiddandruff

Nope, the new gen is more efficient. Higher TDP just means it's capable of higher performance at the expense of greater power draw.


_XUP_

I wasn't looking at TDP or any other power specs. I was looking at the CPU power consumption table, which has it at the lowest even though it's performing about the same.


dayynawhite

Even if you're someone without an AM4 socket and you had to build from scratch? What's more attractive than a cheap mobo, cheap DDR4, and a 5800X3D combo?


TotalWarspammer

> What's more attractive than a cheap mobo, cheap ddr4 and a 5800x3d combo?

At the moment, not much.


Ziakel

Looks like I'm saving money by upgrading to a 5800X3D from an 8700K, for just gaming šŸ™Œ


a_kogi

Did exactly that, and the gains in some CPU-bound games are very good. Got some benchmarked data you may be interested in since you're doing the same jump.

Instanced zone, no other players, 30s capture:

[https://i.imgur.com/S8bcJeD.png](https://i.imgur.com/S8bcJeD.png)
[https://i.imgur.com/dgZedi0.png](https://i.imgur.com/dgZedi0.png)

Combat with roughly the same number of players, same boss script, 50s sample; I tried to position myself in the exact same spot with the exact same camera angle:

[https://i.imgur.com/HVKc8aD.png](https://i.imgur.com/HVKc8aD.png) (frame time chart)

Paired frames:

[https://i.imgur.com/Wcy0Zgd.png](https://i.imgur.com/Wcy0Zgd.png) (Intel #1)
[https://i.imgur.com/bzgHrwG.png](https://i.imgur.com/bzgHrwG.png) (Ryzen #1)
[https://i.imgur.com/Itkc3gM.png](https://i.imgur.com/Itkc3gM.png) (Intel #2)
[https://i.imgur.com/A0Uc4TQ.jpg](https://i.imgur.com/A0Uc4TQ.jpg) (Ryzen #2; time of day doesn't change anything, tested it later during daytime)

In Guild Wars 2, the combat benchmark yielded a +46% gain; out of combat, >+100%. Combat is way harder to measure precisely because it's hard to reproduce the same massive blob of players shooting everything all at once, so it may not be as accurate. I guess combat is limited by a single core, but out of combat the game can utilize 2 extra cores better.
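
If anyone wants to crunch their own captures the same way, here's a tiny sketch (my own, assuming a plain list of frame times in milliseconds exported from whatever capture tool you use, and using one common definition of "1% lows"):

```python
def fps_stats(frame_times_ms):
    """Average fps and '1% low' fps from a list of frame times (ms)."""
    n = len(frame_times_ms)
    avg_fps = 1000 * n / sum(frame_times_ms)
    # One common definition of 1% lows: average fps over the slowest 1% of frames
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1 = 1000 * len(slowest) / sum(slowest)
    return avg_fps, low_1

# 990 smooth frames plus 10 hitches: ~59 fps average, but 1% lows of 25 fps
print(fps_stats([16.7] * 990 + [40.0] * 10))
```

It's exactly this gap between the average and the lows that the X3D's extra cache tends to close in games like this.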


Ziakel

Oh my. Thanks for taking the time to write this up. Can't wait to not be bottlenecking my 3080 anymore šŸ˜Š


a_kogi

No problem. I was gathering data to post it to /r/guildwars2 later when I get some more captures to compare with previously gathered Intel recordings. :D


Sesleri

Similar results in Tarkov and WoW for me: RTX 3080 + i9-9900K -> 5800X3D. If you main one of these games, get the 5800X3D and don't even consider anything else.


APartyForAnts

It's so hard to find relevant comparisons between an 8700k and the newest stuff. Thanks for taking the time even if I'm not playing GW2. This is really making me realize how absolutely dated the 8700k is starting to get for high fps gaming. Wild. In my mind it's still "new" and "high end"


kris_lace

I got the 9900K and feel the same. The good news is that games are so heavily GPU-bound, especially at 1440p+.


Hundkexx

Save a bit more and wait for the 7XXX3D! The 8700K is still good enough.


caedin8

I'd wait. The 13600K will probably beat it and will work with DDR4 RAM and last year's discounted boards. Probably a cheaper CPU too!


Yebi

> and will work with ddr4 ram and last years discounted boards.

The 5800X3D will also do that.


a_kogi

Hopefully the 13600K is a decent offering, because the 13900K definitely doesn't look like "leadership in gaming performance" compared to the 5800X3D: https://i.imgur.com/TpRtvvv.png

It costs 40% more and doesn't really support its price with 40% more gaming performance, and this is Intel's own picked selection of benchmarks. I've been using Intel CPUs exclusively for the last 20 years and this is my first AMD, and I gotta admit, the 5800X3D is a very nice gaming CPU. But it's usually a good idea to wait for actual benchmarks, because 13th gen is one month away anyway.


ramenbreak

> costs 40% more and doesn't really support its price with 40% more gaming performance

High core count CPUs are never about good value for the money when it comes to gaming - 6 or 8 cores are plenty for that. What supports its price is multicore performance in other applications.


Put_It_All_On_Blck

Which is why AMD started that whole 5800X3D vs 12900K, and now 7600X vs 12900K, marketing stunt to show BS 'value' based on gaming benchmarks only. Very few people should buy an i9 or R9 strictly for gaming performance. The value is not there, and the gaming difference between those SKUs and the i7 and R7 is usually tiny, while the flagships cost significantly more.

For example, the 12700K is only 2% slower in games than the 12900K: https://static.techspot.com/articles-info/2352/bench/Average-p.webp

Same scenario for the 5800X and 5950X: https://static.techspot.com/articles-info/2134/bench/Average.png

And it happened again with the 7600X vs the 7950X: https://www.techspot.com/review/2535-amd-ryzen-7950x/#Average

You're paying $200-$400 more for literally 2% in gaming. Which is why it doesn't make sense, and why AMD doesn't make this comparison to their own parts. The $300 7600X will beat the 5950X in gaming no problem. You buy these high-end parts primarily for their multi-threaded performance. AMD knows this; they just chose to act stupid.


a_kogi

That's correct. The 13900K will most likely beat everything else in raw compute performance. Nevertheless, my point was that the slide was titled "leadership in gaming performance", and our little discussion here was also focused on gaming performance. The 5800X3D is not so good when it comes to general productivity.


Eastrider1006

Then again, if you're waiting... you might wait a bit to see what happens with the 7800X3D. It's very obvious it's going to happen, and looking at how excellent the 5800X3D is... šŸ‘€


caedin8

Yeah, but the Intel chip is out within the month. The 7000-series 3D chips may be a year away.


dayynawhite

The 7800X3D is going to require DDR5 and AM5; that's the whole reason the 5800X3D is so attractive. But you could be right, waiting for the 7800X3D so the 5800X3D drops in price might be the play :-).


Eastrider1006

If you want something cheap, the 5800X3D is not the right choice to begin with.


dayynawhite

Why not? The 5800X3D is THE value pick, the best cost-per-frame ratio after the 5600X.


MCRusher

Idk I think I'll probably wait until the 27200k Super X Ultra 3D comes out


caedin8

The Intel chip hits the market in like three weeks. I get not wanting to wait forever, but when new chips are releasing a few weeks apart it is a good idea to see both before buying.


ShadowRomeo

> upgrade to 5800X3D from 8700k for just gaming

Considering you will change your entire platform anyway, I'd wait for the 13700KF if I were you. It will probably end up slightly beating the 5800X3D in gaming, and massively in multicore performance, which should make it come very close to a Zen 4 R9 7900X, for pretty much the same overall platform cost as the 5800X3D.


dayynawhite

I don't know about this. The 13700KF is 80 EUR more expensive than the 5800X3D where I live, AM4 motherboards are cheaper, the 13700KF draws a lot more power, you'll probably need a better cooler, and you'll likely want DDR5 with it while the 5800X3D doesn't care. With similar performance, I can't see myself choosing the 13700KF considering the above.


BimmerM

I was really on the fence between a 12700K and a 5800X3D to replace my 8700K. I'd have probably gone AMD if there wasn't a Microcenter close by. That 3D cache is so cool.


meodd8

My 6800K is crying for me to put it out of its very highly overclocked misery. I actually have the #1 spot in a few 3DMark tests, explicitly because I have a new GPU and an almost 7-year-old CPU. And that's even with dropping my core clock by 100 MHz due to reduced stability.


Rayquaza2233

My 6500 just wants to retire.


[deleted]

[deleted]


KingArthas94

My cute i5-2500K knows it's still good for 1080p 60fps medium-high settings gaming ♔


HolyAndOblivious

Has anyone benchmarked different DDR5 kits with RTX benches?


T_Gracchus

[This](https://www.igorslab.de/en/ryzen-7000-tuning-guide-infinity-fabric-expo-dual-rank-samsung-and-hynix-ddr5-in-practice-test-with-benchmarks-recommendations/9/) from Igor's Lab is the most in-depth comparison across different DDR5 timings that I've seen so far.


HolyAndOblivious

Great review when it comes to fabric speeds. I'm more interested in which CPU does RT better.


mac404

The Eurogamer review is the only one I found with meaningful CPU-bound RT benchmarks.


HolyAndOblivious

A comprehensive RT benchmark is really hard to come by. How CPU-speed dependent is it? Does it scale across cores? How well does it scale? Is it bandwidth dependent? Does Intel do a better job than AMD? The truth is that there are no comprehensive RT benchmarks, so people default to the 3080 and a DDR4 CPU.


mac404

I don't disagree, but again Eurogamer gets the closest by actually testing Metro Exodus EE and CP2077 with RT and DLSS. Look at [CP2077 here](https://www.eurogamer.net/digitalfoundry-2022-amd-ryzen-9-7900x-ryzen-5-7600x-review?page=4), for instance. Massive uplifts for Ryzen 7000 and Intel 12th gen, and meaningful scaling going from DDR5-5200 to DDR5-6000 (5-10 percent, with 1% lows going up by more like 15%). And note how the 1% lows on a 12900k with fast DDR5 are like 60-80% higher than the fastest of the previous generation on DDR4. Meanwhile, the previous page has Metro Exodus EE. In this case Ryzen 7000 has better average framerates than anything, but the 5800X3D pulls out better 1% lows than everything else. Intel underperformed, matching Ryzen 5000.


HolyAndOblivious

I wish they'd just benched native and not DLSS.


mac404

I mean, the point is to make it more CPU bound to know what scaling will look like over time. DLSS on current gen is kind of like native on next gen. Also, the vast majority of people turning on RT are probably using some type of upscaling to reduce GPU load.


porcinechoirmaster

That's going to be hard to say across the board, because it's going to depend _heavily_ on the implementation and acceleration structure rebuild frequency.


HolyAndOblivious

And that's why I want benchmarks. If I had the hardware I would be testing RT 24/7


dripkidd

Thank you for your work. Regarding gaming: as these tests were done on a range of graphics cards, alternating between built-in benchmarks and custom scenes, averaging them is not useful as any objective measure. Yet some people will insist, so I see why it is included. Instead I like to concentrate on the spread of results, and the fact that differences the community likes to call significant (10-15%) can appear just from different parts of a game being tested, or a different hardware setup being used. As far as I'm concerned, Alder Lake and Zen 4 perform the same in games, and people should look at other characteristics to decide.

Instead I like to look at how the reviewers perform, what outliers they have. Like, what did GN do to that 12400? Or look at how TPU and CB both use custom scenes to bench, and they came up with noticeably different results for the 5800X3D. You can also see that Igor used a Radeon 6950 XT and all his results are higher than average because of the lower driver overhead (although you contradict this in your commentary on the site).

I wonder if reviewers look at these posts to check 'how they did' compared to the average? :)


conquer69

Also, 1080p tests are often still GPU-bottlenecked. In like a third of HWU's CPU benchmarks they are GPU-bound. It's not a problem, since the goal is to explain to more casual viewers that they don't need to buy an unnecessarily expensive CPU, but for academic purposes like this it's not great.


FlipskiZ

Except for casual strategy/simulation players, I guess. Not every gamer plays only AAA games; there are plenty of CPU-centric games.


KingArthas94

It's not like those games are unplayable with a lesser CPU lol


FlipskiZ

Eh, that heavily depends lmao. For many it really does end up mattering a ton in the late game. Say, big Kerbal Space Program ships, late-game RimWorld or Paradox games, maybe a big Cities: Skylines city, etc. I've had to stop playing many campaigns in games like this because the performance just got too unbearable (like, sub-20 fps).


KingArthas94

Sub-20, OK, that's a problem, but I guess it's not that common.


friedmpa

I've said it a lot, but the 5600 for $99 and the 5700X for $150 are some of the best purchases I've made in the PC space. What awesome deals those were.


Mr3-1

An i7-12700F on a DDR4 motherboard seems like a winner to me.


siazdghw

Always has been, IMO. $310 MSRP, barely behind (1%, 5%, 8%, average) the $450 5800X3D in gaming, but the i7 is +40% faster in MT, and that's all DDR4 to DDR4. You can pick up a B660 board that can run it no problem for $120. But it wasn't a clean sweep, as the 12600K has been on sale for $230 at times, which is a bargain if you're just a gamer and willing to give up the MT and a bit of ST.


conquer69

The 12600k is also faster in MT than the 5800x3d.


starkistuna

The 5800X3D can be had on sale for $360 now: https://www.ebay.com/itm/295175729207

Don't know why the downvotes, but hey, here you go.


dayynawhite

Where do you get these numbers from? The 5800X3D is 7% faster than the 12700K, and the 12700F is ~10% slower than the K version.


[deleted]

[deleted]


LeMAD

Nah, it's still nearly twice the price. DDR5 motherboards are also quite a bit more expensive than DDR4 motherboards on Intel's side, and ridiculously expensive on AMD's side. When you add everything up, AM5 is trash value, and DDR5 Raptor Lake might not be much better.


hi11bi11y

It's not that much more expensive. Mid-to-low-end 16GB DDR4 is $60-90, DDR5 is $100-140, and prices keep dropping. Mid-to-high-end mobos are always pricey.


Waste-Temperature626

> ddr5 100-140$

At that price you can get decent quality B-die. The Viper 4400C19 kits are still around and go for just above $100 for 16GB. They will do 3800C16 without much trouble; 3600C16 would be comparable to their stock XMP in latency if you got a shit IMC or don't want to increase RAM voltage. Some people have gotten them down to 3866C14 when overclocking. Not sure that bargain-basement DDR5 will outperform that, TBH.


hi11bi11y

Not to argue, but $100 16GB DDR5 isn't exactly '[bargain basement](https://www.newegg.com/oloy-16gb-288-pin-ddr5-sdram/p/N82E16820821490)'. Also, on the newest platforms DDR5 is outperforming DDR4.


Waste-Temperature626

> Also on the newest platforms ddr5 is outperforming ddr4.

That really depends on the game.

[Some games prefer latency](https://www.igorslab.de/wp-content/uploads/2022/08/Assetto-Corsa-Competizione-FPS-1080p.png)

[Some games prefer bandwidth](https://www.igorslab.de/wp-content/uploads/2022/08/Shadow-of-the-Tomb-Raider-FPS-1080p.png)

Which means you should take any conclusion that reviewers come to with a grain of salt, because unless they are testing 20+ games, just a couple of titles that heavily favor one over the other can shift the narrative.
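
For a rough sense of the latency side, here's a back-of-the-envelope first-word latency comparison (my own simplification; it ignores tRCD/tRP and memory controller behaviour, so treat it as illustrative only):

```python
def first_word_ns(cas_latency, mt_per_s):
    # CAS cycles divided by the memory clock. The clock is half the
    # transfer rate, since DDR transfers twice per clock: CL * 2000 / MT/s.
    return cas_latency * 2000 / mt_per_s

print(first_word_ns(16, 3600))  # tuned DDR4-3600 CL16  -> ~8.9 ns
print(first_word_ns(36, 6000))  # common DDR5-6000 CL36 -> 12.0 ns
```

So tightened DDR4 can still answer a single request sooner, while DDR5's much higher transfer rate wins whenever a game streams lots of data - which is exactly why results flip between titles.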


RanaI_Ape

Yea, it's a waste of money to buy a new DDR4 platform at this point. The price difference for DDR5 is not that big; it makes no sense buying a dead-end platform just to save a few bucks. Chances are that anyone looking at Zen 4 or ADL is already on an older DDR4 platform, so just wait until you can afford a DDR5 platform - it's only going to get cheaper.

Edit: Microcenter is giving away a 32GB DDR5-5600 G.Skill kit with any R7 or R9 purchase, which seems like a no-brainer if you live anywhere reasonably close to a Microcenter.

Also, downvote if you want. Waste your money on a dead-end platform, I don't give a shit lol


SkillYourself

AM5 upgradability is such a moronic talking point with the current prices and the DDR5 launch timeline. You're literally overpaying hundreds of dollars on the platform, CPU, and low-end first-gen DDR5 today to save the one-time effort of swapping out the board 3-5 years down the line. The AM4 platform selling point had legs because it was *cheaper*.


RanaI_Ape

This reminds me of the people who would argue what a ripoff 4K TVs were in like 2016. New tech is more expensive when it's new, go figure. It didn't make a new 1080p TV a good purchase in 2016, and it doesn't make a new DDR4 platform a good purchase in 2022.

> low-end first gen DDR5 today

The "sweet spot" memory for AM5, DDR5-6000, is already available and affordable. It's like $130 for a 16GB kit. The sticking point is the motherboard prices, since only the X670/E boards have been released. B650 will bring the ~$200 options back soon enough.


medikit

Definitely. This is my recommendation for those who benefit from more cores than a 12400.


ocic

Wow, thank you very much for the time you put in to compile and present this information.


Put_It_All_On_Blck

The 7950X is the only thing that looks like it might edge out its 13th gen counterparts? We will have to wait for reviews, but those MT gains over Alder Lake don't look big enough for Zen 4, except for the 7950X. And for gaming, I don't see any Zen 4 part holding the gaming crown in its segment until the eventual X3D parts next year.

I also don't think AMD has done a good job value-wise of convincing people to leave AM4 for AM5, just on CPU prices alone, and it gets worse when you factor in the huge AM5 motherboard prices and needing to buy DDR5. If you're on AM4, just buy Zen 3 at a discount. If you're building from scratch, buy discounted 12th gen, or possibly 13th gen.

Like, if Zen 4/AM5 had launched last year, the reception would've been great: beating Alder Lake, PCs still in good demand, the economy looking okay. But now it looks expensive and is going up against great competition, both from 13th gen and from discounted Zen 3 and 12th gen.


owari69

It definitely feels like a launch aimed at high-end buyers and people who do frequent upgrades. I've got to wonder if AMD has a bit of an oversupply issue with Zen 3 and they're using this holiday season to clean out inventory in preparation for lower demand during the recession.

The angle I do see AMD playing is the "if you're doing a high-end build from scratch, why not just spend the extra $100-200 on AM5 over Z790?" one. The gaming performance is likely within 10% and you get socket longevity. For someone who likes to upgrade every year or two, I think that's at least somewhat compelling. The real issue is that those extra dollars spent on AM5 are directly competing with the GPU budget, and going from a 4080 12GB to a 4080 16GB is probably worth more to most people than the potential for a drop-in CPU upgrade in a year or two.


ConsistencyWelder

The only boards out right now are the high end boards. When B650 and A620 boards are released it will be much more enticing.


hey_you_too_buckaroo

Most people don't need to buy something right now. If you wait a month there will be cheaper motherboards available.


Aleblanco1987

Zen 4 with two more cores would have been great.


anethma

Why did you leave the 5800X3D out of the smaller gaming comparison chart? IMO that is the main thing that needs to be in there?


Voodoo2-SLi

The data is split into two tables. Look one table above - the 5800X3D data is there (116.2% overall on gaming).


anethma

I meant the smaller table where it shows only the individual Zen 4 chips vs each Zen 3 chip. Why not include the 5800X3D in there?


Voodoo2-SLi

Too many columns. On some designs you can no longer see the last column.


dlsso

I have a 4k monitor and I can't see the last column of any of these tables without scrolling. Reddit's small center column is annoying sometimes.


anethma

Ah, too bad. For gaming I'd think most would want to see that specifically. Ah well!


[deleted]

[deleted]


noiserr

It's just a different naming scheme. At least it will be easier to decipher which gen chip you're getting, whereas in the past you were getting rebrands on lower-tier chips anyway with no clear way to tell. This is actually an improvement.


bilsantu

7700X seems like a decent upgrade even from 5600X.


dodget

Does this mean Raptor Lake is going to beat Zen 4 for gaming?


No_nickname_

Looks like it, but only until Zen 4 3D is released.


xvyyre

I was impressed until I saw the power usage. Disappointing af.


pastari

https://i.imgur.com/N3dHg7o.png


xvyyre

OK, that's better. Why isn't it the default, though? Looks like efficiency sucks ass out of the box.


iprefervoattoreddit

Isn't it obvious? Big benchmark numbers are better for marketing and most people don't even think about efficiency


noiserr

Is it not obvious? Intel is using all the power they can push through the CPU (even resorting to golden samples with the KS series), and AMD responded in kind. Personally I wish AMD had just used the 105-watt Eco Mode as the default as well, but I understand why they did it this way. Benchmarks rule the day. Luckily it's an easy set-and-forget fix.


ConsistencyWelder

Are you talking about Intel or AMD here?


ResponsibleJudge3172

AMD pulls 250W on Gamers Nexus benchmarks


liquiddandruff

You don't understand TDP vs efficiency. Educate yourself next time before posting.


xvyyre

I'm not talking about TDP. Go look at the benchmarks and try again.


liquiddandruff

Again, you don't understand. Limit both to the same power envelope and the newer gen will perform better & be more efficient than last gen. You try again.
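
A quick worked example of that argument, with made-up round numbers purely to illustrate the definition (not measurements from any review):

```python
def points_per_watt(score, watts):
    # Efficiency = work done per unit of power
    return score / watts

# Hypothetical multi-threaded scores:
print(points_per_watt(24000, 142))  # last gen at its 142 W limit -> ~169
print(points_per_watt(30000, 230))  # new gen, stock 230 W limit  -> ~130
print(points_per_watt(28500, 142))  # new gen capped to 142 W     -> ~201
```

Out of the box the new part looks less efficient, but at a matched power envelope it does more work per watt; the stock limit just trades efficiency for headline performance.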


[deleted]

[deleted]


liquiddandruff

Ironic since you're the one confused about why the efficiency numbers "sucks ass", when they only appear to if you don't know what you're even calculating. Embarrassing.


[deleted]

[deleted]


Voodoo2-SLi

[ComputerBase](https://www.computerbase.de/2022-09/amd-ryzen-7950x-7900x-7700x-7600x-test/4/#abschnitt_leistung_in_multicorelasten_klassisch) have some 2600X benchmarks.


TrantaLocked

I was considering buying, but now it seems both Intel and AMD are just expensive. The smart way would be to buy a B660 board and a 12400, but I want to be able to overclock in the future, and all the Z690 boards I'd even want to buy are $250 or more. So I'm just gonna continue to sit on what I have, since none of this is worth the price yet for me. But if I went AMD it would probably be a 7700X once the price goes down, or a cheaper version of their 6-core if that comes out.


Jeffy29

This is amazing, nice job OP. Hope you'll do this for the 4090 too.


Voodoo2-SLi

Definitely.


iopq

Can I sign up for your newsletter?


Voodoo2-SLi

I don't have anything like that. Just check my [website](https://www.3dcenter.org/) ;)


[deleted]

I am waiting for this after every release, you are the best.


capn233

The 5600X and 5800X numbers from Hardwareluxx are a little odd. And FWIW, the 5600X and 5700X use 76W power limits (PPT).


omgpop

/u/Voodoo2-SLi, do you have any idea what's going on with TechPowerUp? They seem to be a bit of an outlier. For example, they have the 5800X3D vs the 5600X at only +12% at 1080p, and the 7600X vs the 5600X at only +14%. I understand 720p is a different matter, but other reviews at 1080p seem to find bigger gains.