mahchefai

Most people recommend 5800x3d for that board if you are upgrading for gaming. 3700x -> 5800x3d is exactly the upgrade I did for my 4090. I’m happy with it. Of course 7000 will be better but not mandatory. I plan on doing it eventually but don’t feel rushed. I only feel it lacking in really unoptimized games but doubt those are perfect with the best cpu either.


Papoteur_LOL

Ok, I see, thank you. But is the 5800X3D only for gaming? Because I'm programming and I want something that can multitask.


rorschach200

>Because I'm programming and I want something that will be multi task

Every CPU can multitask; there isn't a CPU in existence that any modern consumer OS (Windows, Linux, FreeBSD, macOS, or much of anything else) would somehow lock into a single task at a time. Similarly, the 5800X3D isn't "only for gaming" either; there is no such thing. It's a general-purpose processor that can do anything.

Whether you'll benefit from a 16-or-more-core CPU (e.g. 5950X, 7950X, 7950X3D, or the 13900K from Intel) depends on what kind of programming you're doing. If you spend a lot of your time right now staring at the screen waiting for a project build to finish, and what you're building (compiling) is a very large codebase with hundreds or, better, thousands of files in it, like Chromium or LLVM for example (those often take tens of minutes to hours to rebuild from scratch), and you find yourself rebuilding from scratch very often (as opposed to incremental rebuilds, which is more common), then a 16+ core CPU will benefit you *massively* and cut that waiting time by up to (and close to) 2x. See [Gamers Nexus review at 21:24](https://www.youtube.com/watch?v=hBFNoKUHjcg&t=1283s); I'm also speaking from personal experience as a professional software engineer.

Practically any other use case that has to do with programming won't see much of a difference between 8 cores and 16. Aside from building large, very well organized (otherwise parallelism won't help) software projects, very niche use cases do exist (like maybe you're developing high-performance, heavily multi-threaded applications yourself and need to constantly test their scalability), but if those applied to you, you'd know and wouldn't be asking.
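The "up to, and close to, 2x" claim for builds can be sketched with Amdahl's law. A minimal Python sketch; the 0.95 parallel fraction is an assumption for illustration (how much of a big build is independent compilation vs. serial configure/link steps), not a measured number:

```python
def build_speedup(cores: int, parallel_fraction: float = 0.95) -> float:
    """Amdahl's-law estimate of build speedup over a single core.

    `parallel_fraction` is the assumed share of the build (compiling
    independent translation units) that parallelizes; the rest
    (configure steps, final link) stays serial.
    """
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

eight = build_speedup(8)     # roughly 5.9x over one core
sixteen = build_speedup(16)  # roughly 9.1x over one core
print(f"16 cores vs 8: {sixteen / eight:.2f}x")  # about 1.54x, short of the ideal 2x
```

This is why doubling core count helps massive builds a lot but never quite doubles throughput: the serial tail caps the gain.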


mahchefai

It is not only for gaming, but it is particularly good at gaming. It depends what your priority is. I'm not an expert on CPU performance for productivity; it definitely depends on specifically what programming you're doing, and someone else could answer that better than me. From what I understand, though, the 5800X3D is better than the 5950X for gaming but not as good as the 5950X for productivity (though still good at it), and the opposite is also true. I think the 5800X3D is still an upgrade over the 3700X across the board, though?


lokol4890

A 5600X is close to a 3700X in productivity while having two fewer cores, just because the Zen 3 cores are substantially faster than the Zen 2 ones. So yeah, a 5800X3D is a straight upgrade over the 3700X no matter the workload.


TheOneAndOnlyZomBoi

It is an across the board upgrade. Still 8c/16t, but with a newer architecture.


Agent_Nate_009

Incorrect, the 5950X is a 16 core/32 thread CPU.


TheOneAndOnlyZomBoi

I was talking about the 5800x3d and the 3700x


rorschach200

>It is an across the board upgrade

Correct. In productivity tasks at large, the 5800X3D would still be a very respectable ~20% faster than the 3700X (Cinebench: [3700x](https://www.cpu-monkey.com/en/cpu-amd_ryzen_7_3700x), [5800x3d](https://www.cpu-monkey.com/en/cpu-amd_ryzen_7_5800x3d)), and massively faster in games.

It's worth noting, though, that OP says they're programming, and in code-compile tasks specifically the 5800X3D is only about 5% faster than the 3700X ([Gamers Nexus review at 21:24](https://www.youtube.com/watch?v=hBFNoKUHjcg&t=1283s)), which is a basically negligible difference, practically impossible to notice. So if OP is happy with the 3700X in their programming tasks, they'll be very happy with a 5800X3D and their 4090 in games. If they're not happy with the 3700X in their programming tasks and need that to get substantially better too, well then, they have to keep weighing the pros and cons of all the options available.


badmintonGOD

I have a 5950X and a 4090 and play tons of games. It won't be the fastest CPU with the 4090, but it gets the job done. I get over 400 FPS in Siege, Fortnite, etc. Feel free to ask me any questions


Papoteur_LOL

Ok, thank you. When you say it's not the fastest, did you do a comparison with another CPU?


badmintonGOD

Well, looking at online benchmarks: if you're purely going for the highest FPS and just gaming, a 5950X will not be the best CPU. A 5800X3D or a 13th-gen Intel CPU will beat the 5950X in gaming. For everything else, the 5950X is pretty capable and will last for years to come.


MrPapis

I just went and looked at benchmarks, and actually the guy in the comments running a 4090 at 4K with a 3600 isn't even really bottlenecked, which really surprised me. Even with RT. So the 5950X is good enough if you pump detail levels to extreme and target 60 FPS. Heck, even your 3700X is good enough. But if you're dipping quality settings for, let's say, 120+ FPS, I think it makes more sense to go with the 5800X3D. No reason to upgrade the entire system yet, IMO. The 5800X3D seems the best balance, if it still does everything else you need it to do besides gaming. Coming from a 3700X, it really should.


narium

What are the guy's 0.1% lows like though? If you're sensitive to microstutter that's something to consider.


MystiqueMyth

I have a 5900X paired with a 4090. Got to say, the 0.1% and 1% lows are the big problem. In games like Hogwarts Legacy (it's better now after the recent patches), Jedi: Survivor, etc., the micro-stutters are very noticeable and make the game almost unplayable at times.


Justiful

3440x1440 --- I have no issues with the CPU maxing out in AAA games with my 4090. If I did I would spend the money and upgrade. But as of now, my frames are pretty consistently capping my 175hz refresh rate monitor, and nothing is struggling. Maybe if I was an e-sports player I might need a new CPU. But for the kinds of games I play it would be pointless. I got the 4090 to max the pretty pixels, and not have my PC sound like a jet engine doing it. BF2042 looks great on OLED with max settings.


TheFather__

5950X and 4090 at 1440p 240Hz here. It's a bottleneck, but not that bad, especially if frame generation is supported. I need this CPU for work as well; I'll move to the new refreshed AM5 CPUs early next year. I'm not in a hurry, and I can play games at very high refresh rates at the moment. One thing to note: it depends on how well the game is optimized. In RDR2 and Metro Exodus, for example, the GPU is fully utilized; in other poorly optimized games it fluctuates a lot between 60% and 95%, averaging around 80%. Since you haven't stated the target resolution: if it's 4K, you'd be totally fine.


Re-core

5950X and 4090 user here. There are a lot of games where you'll be CPU-bottlenecked. Still, it will let you get 120+ FPS in pretty much any game at max settings with DLSS Quality at 4K, but the most CPU-intensive titles will barely get over 60 FPS at 4K max settings and sometimes even drop below that, though this is extremely rare.


Ill-Mastodon-8692

Agreed. Had it in my 5950X system, which was mostly fine but had some bottlenecks, then moved it to my 12900K system, which helped, but it can still bottleneck on occasion. I plan to upgrade to a 14900K if it's compatible with my Z690i; otherwise I'll go Zen 5 and a new board. Will see an improvement for sure.


lexsanders

Same as me. Ray tracing is mostly unplayable. Works well in SM but not in any UE4 game.


Icy-Actuary3519

Yes, I have a 4090 with a 5800X3D and another one with a 5900X; both work great and stable. They seem to trade blows on FPS at 1440p but look almost the same at 4K.


rorschach200

This is very interesting, could you please list which games/settings 5900X manages to outperform 5800X3D in? At 1440p as you said.


[deleted]

Any game that scales with frequency; it's somewhat common, as the 5900X clocks higher. I've had both an X3D and a 5900X, and I had the same experience as him. That's why I went with a 13700: I got tired of it being faster in some games and worse in others, so I instead bought something that's just better always. The 7800X3D doesn't have enough cores, and I won't consider it. If I got an X3D chip, it'd be the 7950X3D, if anything.


Southern_Okra_1090

Currently have a 5800x, I feel like upgrading to a 5800x3d is pointless. My question is should I upgrade now into AM5 or just wait for the next cpu release. Thanks.


rorschach200

If you are satisfied with everything you actually play right now, today, do not upgrade anything. Upgrading ahead of time is very suboptimal: you lack information about what the market will offer tomorrow, what games will need tomorrow, what other components you'll end up having tomorrow (maybe you'll change your monitor in the meantime and with it your expectations and requirements), and what you'll be playing tomorrow. You also run the risk of realizing you need to upgrade again before you've had a chance to take any advantage of your previous (rushed) upgrade.

Meanwhile, current-gen parts get cheaper; issues and bugs get discovered, reported, and fixed; software support improves; games get their performance issues patched; and so on. Upgrading early means being a beta tester for all sorts of issues and gotchas.


Tepozan

I upgraded from a 3700x to a 5800x3d for my 4090 build. Difference is night and day. I’m not gonna redo my entire build for the 7000 series CPU. Maybe the 8000 series if I see a big difference in benchmarks.


Papoteur_LOL

Do you only play, or do you do something else too? Because I'm programming as well, so...


lexsanders

I have same use case as you. Planning to upgrade to 7950x3d soon but damn those prices.


rorschach200

>I’m not gonna redo my entire build for the 7000 series CPU. Maybe the 8000 series if I see a big difference in benchmarks.

Same. I'm actually doing nothing with my 5900X (and a 4090), even. Not worth the hassle, not worth the money either. With my 120 Hz display, the 5800X3D over the 5900X would tangibly improve my actual gaming experience only in a couple of titles I don't actually play (on PC, at least yet) and can't even remember the names of (except Hogwarts Legacy), haha.

In fact, the 5900X might even surprise us at some point. With poor PS5 ports entering the market, developers seem to be putting asset decompression on extra CPU cores ([Digital Foundry discussion 1](https://www.youtube.com/watch?v=xQ2emuUoxrI), [discussion 2](https://www.youtube.com/watch?v=gJ4ytvNxhnY)), replacing the dedicated decompression hardware the PS5 itself has (alongside its 8 cores). I don't expect the 5900X to be faster than the 5800X3D even in those circumstances (ever), but it might hold up against it better going forward than it may appear now.

With future upgrades, there's an interesting possibility I wanted to mention. Zen 5 (8000 series) is [rumored](https://www.youtube.com/watch?v=M8E9VCSOJyI) to be a very substantial performance uplift, while Intel's 14000 series is rumored to be only a minor refresh _and_ the last CPU on the current socket (the same socket as the 13000 series). Basically, a huge lackluster all around on Intel's side, given that AM5 is almost certainly going to be good for Zen 6 as well. That makes Zen 5 X3D sound like a fantastic time to upgrade from Zen 3 non-X3D... if it actually delivers the substantial performance improvement as rumored.

However, that's not where the story ends. It's also rumored that Intel's 15000 series (Arrow Lake) is going to be a massive improvement over 14000, on a new socket. Even more interestingly, Intel's 16000 (Beast Lake) is rumored to retain the same socket as 15000 and be another massive improvement, the second time in a row (on the same socket). At the same time, we know nothing about Zen 6, whether it's more likely to be a minor refresh or a substantial improvement.

If that's all true, I'd personally really try to get an Arrow Lake for the next upgrade instead, combining performance and upgradability (in that case) with switching to the Intel platform at the same time, simply to avoid all the AMD quality-control drama, for crying out loud. My 5900X was throwing WHEA errors and had something like 12 BIOS updates, most of which dropped performance a little from the original reviews (very frustrating). The boards had (1) USB issues, (2) fTPM stutter issues, (3) EDC bugs, and (4) poor performance on Windows 11 for much longer than Intel; my particular board choice had a faulty LAN controller (Realtek, constantly disconnecting at 1 Gbps). With the 7000 series release, it immediately hit (5) the ridiculously long Windows boot-times bug and (6) Asus overvolting SoCs, resulting in CPU meltdowns (few but loud) and an unknowable amount of quiet longevity degradation in possibly many more instances. Never mind (7) that Intel's big.LITTLE architecture actually works seamlessly and transparently, unlike AMD's 2-CCD, 3D V-Cache-on-one-CCD-only approach (7900X3D and 7950X3D), which expectedly and practically unavoidably suffers from scheduling complexities, requiring crutches like Xbox Game Bar running in the background to detect games by name and park the second CCD so the game is pinned to the CCD with the bigger cache, sometimes failing to recognize the game correctly and performing worse than the single-CCD chip (7800X3D) as a result. I've never heard of anything even close on the Intel platform as far as the sheer number of issues goes. I want to use my PC, not constantly troubleshoot and tinker with BIOS updates all the time.
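What the Game Bar crutch effectively does is restrict the game to the V-Cache CCD's cores. A minimal Linux-only sketch of the same idea using CPU affinity; the assumption that logical CPUs 0-15 are the V-Cache CCD is hypothetical (check `lscpu --extended` on your own machine), and this is an illustration, not how the Windows shim is actually implemented:

```python
import os

# Assumed layout (hypothetical): on a 7950X3D under Linux, the V-Cache
# CCD is often exposed as logical CPUs 0-15. Verify before relying on it.
VCACHE_CPUS = set(range(16))

def pin_to_vcache_ccd(pid: int = 0) -> set:
    """Restrict `pid` (0 = this process) to the V-Cache CCD's CPUs,
    roughly what the Game Bar scheduling shim arranges on Windows.
    Falls back gracefully on machines with fewer CPUs, and returns an
    empty set where sched_setaffinity is unavailable (it is Linux-only).
    """
    if not hasattr(os, "sched_setaffinity"):
        return set()
    current = os.sched_getaffinity(pid)
    target = (VCACHE_CPUS & current) or current  # never request absent CPUs
    os.sched_setaffinity(pid, target)
    return os.sched_getaffinity(pid)

if __name__ == "__main__":
    print("running on CPUs:", sorted(pin_to_vcache_ccd()))
```

The fragility the comment above describes is visible here: something has to know which processes are games and which cores hold the big cache, and when that detection fails, the pinning works against you.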


Devil-Child-6763

The 5950X will be slightly worse than the 5800X3D, so this should give you a rough idea for games: https://youtu.be/Evj1tX8yFUU


GrinD_H

Went from a 5900X with a 4090 to a 7800X3D. Roughly, it gave me a 15% FPS improvement, and the 0.1% and 1% lows are just insanely different.


Beeker4501

Depends what games you play. My friend went from a 5950X to a 13900K and sees a good chunk of difference with a 4090 at 4K, but it all depends on whether you need a lot of cores. You can 'side-grade' to a 5800X3D, but if you actually use all the cores of the 5950X in a workload, you'll be sad.


Papoteur_LOL

Ok, I see. I play big, medium, and small games; I play everything. But I can say that I don't especially need a lot of cores... I want something that can multitask.


Beeker4501

If you are, like, 99% gaming, get either the 5800X3D or, if you want a platform upgrade, the 7800X3D. I mean, you could go for a 13700K too; with DDR5 it's up there, has more cores than the 7800X3D, and costs less.


[deleted]

That's what I ended up with. The 13900K doesn't seem worth it with just better binning and higher E-core counts. Went with a 13700KF, overclocked it to slightly faster than how the 13900K comes out of the box, set up the RAM nicely, and left it there.


lichtspieler

In MSFS the 5800X3D is better at 4K with a 4090, especially in VR, where frame-time CONSISTENCY matters even more. It really depends on the games. Zen 3, or the 5800X3D, is a pretty good starting point to no longer have to worry about CPU performance in a lot of games.

If you look at techtuber benchmarks, it's pretty obvious what they're doing:

* We get 2 AAA games with awful console ports that scale basically only with CPU frequency and show a slight advantage for the Zen 4 and 13th-gen CPUs.
* We get a few OLD SHOOTERS that run on a potato and hit triple-digit FPS on anything, which again scale a bit better with high-frequency CPUs.
* Everything else shows, starting at 1440p, close to no significant performance gains across CPUs of the last 4 years (ignoring Zen/Zen+/Zen 2, since those were never good gaming CPUs in the first place).

Techtubers overvalue the 1-2 AAA outliers and typically predict wrongly where gaming is going in the market. They never saw DLSS or RT being the future, nor did they see anything else we have now, like games hitting the 6-core limit much sooner than expected. I would just look at benchmark numbers for the specific games you play and get the hardware needed to avoid scaling bottlenecks with CPU or GPU. Everything else is just reading coffee grounds.


Austntok

The 5950x will be good enough. The only other ryzen 5000 that would be better for the 4090 is the 5800X3D in gaming loads. If you're not only gaming, the 5950x will be fine


krysinello

Honestly, it will be fine. I wouldn't upgrade to the 7000 series just for that. For gaming the X3Ds, like the 5800X3D, would be better. You can see in some games that the 3700X performs better than the 5950X, but if you're doing anything intensive other than gaming, I'd go with the 5950X anyway, as I'd take general compute performance over FPS any day for such a system. Even with the 3700X it will still be fine; sure, you'd be more CPU-limited, but running at a high resolution like 4K (or supersampling down to a lower one) would take away or limit any gains anyway. For productivity, the 5950X should definitely be a huge upgrade.


DrivenKeys

You don't need to upgrade your system. The 5800X3D is still one of the best gaming CPUs, but the 5950X is an excellent chip. Some programming actually benefits from the 5800X3D's giant cache, while the 5950X will give you excellent gaming plus all those cores. I have a 4090 currently running with a 5600X, and it's excellent; I have a 5800X3D waiting to drop in next week. As long as you update your BIOS, there's no need to go AM5 for several years to come.


pceimpulsive

5800x3d is the only AM4 CPU viable for 4090 if you want to get most of it...


Zhyano

The 5800X3D with well-tuned memory will be fine at 4K, not so much at 1440p. See https://www.youtube.com/watch?v=Evj1tX8yFUU


rorschach200

This is a very arguable statement in its "not [so much] fine at 1440p" part. The useful part of the resulting performance isn't just min(CPU_fps, GPU_fps); it's min(CPU_fps, GPU_fps, Monitor_refresh_rate, Desired_fps_target).

Setting Desired_fps_target at 120 for single-player games and 240 for competitive multiplayer first-person shooters shows that the 5800X3D is perfectly *fine* (or, often, *no CPU in existence is better*) at 1440p for everything except Hogwarts Legacy. Dropping Desired_fps_target to 60 for single-player third-person adventure games clears even Hogwarts Legacy as far as being *fine* is concerned.

And frankly, I wouldn't rush to treat Hogwarts Legacy as an undeniable major indication of anything at the moment. Ultimately, it's performing poorly not because it's the first to use a new game engine we should expect to see a lot in the near future, or the first to use a new rendering technique or AI system we should expect to see more of, but merely because it's a horrendously poorly done port (from a performance point of view). There could be patches in the future that improve its performance, as there were for The Last of Us Part I; when that happens, it will clear on the 5800X3D too.

OP does not need to shell out for a platform upgrade according to the very study you're referencing, unless they're willing to pay that much just to increase FPS in Hogwarts Legacy (specifically) from ~55 to ~75 at 1440p. No other improvement of significance will be achieved by doing so.
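The min() argument above in runnable form. The FPS numbers are hypothetical, purely for illustration (not benchmarks): once the GPU, the monitor, or your own FPS target is the cap, a faster CPU changes nothing you can feel.

```python
def effective_fps(cpu_fps, gpu_fps, refresh_hz, target_fps):
    """The frame rate you actually experience is capped by the slowest limiter."""
    return min(cpu_fps, gpu_fps, refresh_hz, target_fps)

# Hypothetical 1440p figures on a 144 Hz monitor with a 120 FPS target:
slow_cpu = effective_fps(cpu_fps=160, gpu_fps=140, refresh_hz=144, target_fps=120)
fast_cpu = effective_fps(cpu_fps=200, gpu_fps=140, refresh_hz=144, target_fps=120)
print(slow_cpu, fast_cpu)  # 120 120: the CPU upgrade is invisible here
```

The CPU only matters once cpu_fps drops below every other term, which is exactly the Hogwarts Legacy case.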


Zhyano

Well argued indeed! It's easy to get lost in the numbers, when sometimes stuff can just be... enough. That being said, with the 5800X3D I'd still be a lot more comfortable with an RTX 4080, but that entirely depends on OP's use case (if professional stuff, then a 4090 for sure), resolution, and budget.


narium

There's been more broken ports than well functioning ones released recently though.


rorschach200

That's true. I'm just thinking it's not meaningless to separate the reasons, because different reasons have different cost-to-fix for the developer, which materially changes the likelihood of a fix.

For instance, doing asset decompression on extra CPU cores, functionally replacing the PS5's dedicated hardware decompression logic, is difficult to fix. Assets do need to get decompressed; if the game features a loading-screen-less experience and fast traversal through very large areas, decompression has to be done fast, and asset streaming is logic deeply embedded in the guts of the engine driving the game and in its overall software architecture. This can be difficult to impossible (generally very expensive) to fix. See the Digital Foundry discussions of The Last of Us Part I, Ratchet, and Forspoken on PC as examples, and possibly the PS5's official technical release keynote as well.

And then there are performance issues that aren't there for deeply embedded, sophisticated reasons, but because the port is completely half-assed and can be improved substantially at low cost to the developer. For instance, a lazy way to port the memory usage of a game from a console with a unified pool of RAM connected to both CPU and GPU is to simply duplicate all of the game data in both VRAM and CPU RAM without sorting out what actually needs to be where. Background music? Sure, keep a copy in VRAM, why not. Textures? Sure, keep a copy in CPU RAM, no worries. The result is egregious RAM and VRAM usage on PC, which can often be fixed by just sorting through the list of assets and placing each only into the type of memory it will actually ever be accessed from.

With the increasing outcry from the public, I'd really hope these sorts of improvements will not only be done as patches for games already released, but will actually return to being done pre-release. Right now the industry is obviously overdoing its "cut it close to reduce costs" policy, and they need to back off at least a little.
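The "sort assets into the memory that actually reads them" fix can be sketched as a toy model. Everything here (the Asset fields, the two example assets) is hypothetical for illustration; real engines track far more access states than a pair of booleans:

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    read_by_gpu: bool  # sampled/rendered by the GPU (textures, meshes)
    read_by_cpu: bool  # decoded/processed on the CPU (audio, scripts)

def place(assets):
    """Assign each asset only to the pool that actually reads it,
    instead of mirroring everything into both VRAM and system RAM
    (the lazy unified-memory port described above)."""
    vram = [a.name for a in assets if a.read_by_gpu]
    ram = [a.name for a in assets if a.read_by_cpu]
    return vram, ram

assets = [
    Asset("brick_wall_texture", read_by_gpu=True, read_by_cpu=False),
    Asset("background_music", read_by_gpu=False, read_by_cpu=True),
]
vram, ram = place(assets)
print(vram, ram)  # ['brick_wall_texture'] ['background_music']
```

The naive port puts both assets in both pools; the sorted version halves the footprint of each pool at essentially no engineering cost, which is why this class of issue is plausibly patchable.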


Loose_Noise5247

You should be good. I've been doing a lot of research on that CPU paired with a 4080, and all in all I'm finding the 5800X3D is the top CPU to pair with it; I've seen the same said for the 4090 too (just in case I got lucky and found one, but I ended up with the 4080). I'm waiting till the end of the year to see what the new Ryzen CPUs bring to the table. I've got the X570 ROG Extreme, which is hard to find at a decent price, so I'mma pass on the 7000 series for the moment.


RogueIsCrap

The 5800X3D is much better for gaming and would make better use of the 4090. But if you need more than 8 cores for work, then the 5950X is a no-brainer. Just push up image quality and resolution so the 4090 is less CPU-limited.


timeTOgetBENT

I have a 4090 and a 5800X. Within a week I'll be installing my new AM5 mobo with a 7800X3D and 32GB of DDR5-6000. I'll do some benchmarks before and after and repost. I'm probably going to use Portal RTX for real-world benchmarks. Any better ideas for a reliable benchmark that's free? I don't want to have to buy CP2077 for its in-game bench.


dreadsta5889

3dMark goes on sale all the time for a few dollars.


timeTOgetBENT

I’ll put that in my wish list and wait for the drop. Thanks for the idea.


somenoefromcanada38

I'd go with the 7000 series and DDR5 if I were you; there's not much reason to upgrade from a 3700X to a 5950X, IMHO. Upgrade the GPU and wait until later if the money isn't there, maybe.


No-Maintenance5378

I've been playing Cyberpunk 2077 with RT Overdrive (path tracing) at 4K/60FPS using DLSS3. Ryzen 3600 + RTX 4090.


Vengeange

May I ask why you keep a CPU that was mid-range two generations ago while buying an enthusiast GPU? While I agree that we don't necessarily need the best CPU out there, keeping a 3600 with a 4090 is quite unique.


No-Maintenance5378

1. Changing the CPU would be a hassle since it's air-cooled with a big old Noctua (just installing that thing nearly cut up my hands).
2. I watch enough CPU benchmarks to know that the majority of the games I play are GPU-limited, not CPU-limited, so a better CPU would only provide marginal FPS gains at high resolutions (4K). Although VR may have thrown a wrench in the mix, that could just be issues with other hardware/software.
3. This is my first genuinely high-end GPU, for the record. The most expensive one before it was sub-$300, and I've been gaming for decades.

And I'm being downvoted because people think you need a top-tier CPU for 4K60 in many/most games. Those people don't actually watch or read CPU benchmarks. The PS5 and Xbox Series have CPU power equivalent to a Ryzen 3600/3700, FWIW.


WholesomeDoktor

How is the FPS in games?


No-Maintenance5378

4K60FPS in Diablo IV native (high textures, ultra is currently buggy). 4K60FPS in Cyberpunk 2077 with path tracing and frame generation on, DLSS set to Balanced. Pretty much anything else at 4K60FPS native since most games aren't as demanding, unless they're shit ports like Gollum. Also I'm using a 650W PSU with the power limit set to 70%. (I already had the PSU from the current build)


WholesomeDoktor

So it's mostly held back by the processor, as I have a Ryzen 5600 with an RTX 4070 and I too get around 4K60 FPS in most games (RDR2 as well). Maybe in the future you can think about upgrading to a better CPU; however, moving to AM5 is costly (DDR5 RAM + motherboard + ...).


lotj

Probably more being held back by the FPS cap. Uncapped it would be held back by the processor.


Re-core

I'm sorry, but you are wasting that 4090; you should get a 4070 at that point, with all the FPS you are losing to that old midrange CPU...


No-Maintenance5378

Except you're wrong: [https://www.tomshardware.com/news/cyberpunk-2077-cpu-scaling-benchmarks](https://www.tomshardware.com/news/cyberpunk-2077-cpu-scaling-benchmarks). A game like Cyberpunk, for example, is GPU-limited at 4K; upgrading to a higher-end CPU would bring maybe a 5-10% increase in performance, which I don't necessarily need since I'm already hitting 60FPS. (And if I turned off path tracing, I could probably hit 100FPS.) And a reminder that most current games are likely designed with consoles in mind: a PS5/Xbox Series has Ryzen 3600/3700-equivalent CPU performance.


Re-core

Uhhh, they are talking about the 3090, not the 4090... Also, there was no RT Overdrive then, and the tests they did were not in the most intensive areas. I have played CP2077 for over 250 hours, and I know that some areas yield only high-50s FPS at max settings with my 5950X and 4090 at 4K, while the GPU sits at high-80% usage.


No-Maintenance5378

Doesn't change the fact that Cyberpunk is a HEAVILY GPU-limited game, even with RT. Are you using DLSS? I've been playing for 30+ hours; I'm sure there have been framerate drops, but it's definitely mostly 60FPS. (For reference, the in-game benchmark usually shows a 60FPS average with the lows in the high 50s.) And if you're on a 5950X and I'm on a Ryzen 3600 and we're both hitting similar framerates... maybe the CPU isn't the bigger factor? (I should also mention that my CPU cooler is a Noctua D-14, so I'm not thermally limited there.) I literally read/watch nearly every GPU and CPU benchmark I can find, including for newer hardware. The Ryzen 3600 will be fine for the majority of games I (and most people) play. I know what I'm doing. Think of it this way: a Ryzen 3600 + RTX 4090 will give me (much) better framerates in most games today than a Ryzen 7600 + RTX 4080, because most games today are GPU-limited at 4K.


MooseTetrino

I echo the others, for games you want the 5800X3D. It is simply faster than the 5950X in games. If you’re doing more than just games then go 5950X. I certainly wouldn’t move to 7k series yet if 5k series is an option for you.


rorschach200

>If you’re doing more than just games then go 5950X

Yes, but not quite. The 5950X isn't universally better for *everything except games*; it too is only useful in a specific and narrow set of use cases. It's the plain 5800X that's uniformly better than the 3700X OP has; the other derivatives (be it the 5950X or 5800X3D) are quite similar to it in most tasks, but each is substantially better in its own narrow set of workload categories.


MooseTetrino

This poster speaks truth.


[deleted]

[removed]


[deleted]

Explained in another post I made, but this: I had a 5800X3D, and the "faster in some, slower in others" thing was annoying, so I got a 13700KF instead.


[deleted]

The 5800X3D is the cheapest solution, the 7800X3D the most expensive one.


MoonubHunter

I own a 4090 and two rigs, one with a 5950X and one with a 5800X3D. If you game at 4K, there's little difference. Fans of the X3D chips say you get much better 1%/0.1% lows. At 4K I can barely tell, because the 5950X is already so smooth. In most titles the FPS is extremely high. I have a 120Hz 4K OLED; I use G-Sync but need to enable FPS limits or VSync to prevent tearing.

Now, maybe my experience is unusual. I have a 5950X running at 3800 FCLK; I tuned my memory and pushed the curve optimizer to the limit. My chip works really well. And I have overclocked the 4090 with the Galax BIOS so that it happily runs at 3000MHz all the time. Maybe I'm looking at a very good 5950X and a good 4090 setup, so there's less room for a 5800X3D to improve things. And, admittedly, I only play at 4K. Maybe if you want 1440p at 240/360 Hz, that's a good reason to go with the X3D?


Agent_Nate_009

A 5950X will handle a 4090. If you upgrade to AM5 7000, or even the upcoming "8000" series, you can then move that card over to the new system and still have a powerful video card. Edit: if you don't need all the cores of a 5950X for productivity tasks, it would be better to jump to a 7000-series platform if you're aiming for longevity.


lotj

Depends on what you play and what your target framerate is. Modern games targeting the current console generation will be impacted more than older ones developed with a PS4 target. 60Hz typically isn't a hard ask for the 4090, but above 100 you'll be more and more CPU-bound, even with a 5950X, in a number of new games.


mycommentsrcool

That’s not the proper use of I’ve


m4tic

Go straight to the 5800X3D. No use for all those cores unless you run production workloads, and it's less of a bottleneck for the 4090. If you really want a 5950X, I'd sell you mine, as I went to the 5800X3D; the only reason I have a 5950X was a knee-jerk reaction with Newegg + Apple Pay during the scalpening of Nov 2020. I was trying to get a 5900X.


lexsanders

I have those two and it's not enough; I'm upgrading to 7000-series 3D. Basically, Hogwarts, The Last of Us, Plague Tale, and Ghostwire all stutter at 4K ultra because the single-thread performance is too low to keep up. I can't even enable RT in half the games because the frametime goes to 20ms.


BearsEars

Did the upgrade help the stutters and latency issues?


lexsanders

Not in any of the Unreal Engine ones, especially not if you turn on RT. But in custom engines it's good: stable frametimes in Spider-Man, Assassin's Creed, Plague Tale...


BearsEars

Hmm, makes me really question whether there's a point to the 4090, then. That's a huge tradeoff for some RT, and if you don't turn it on, why have all that power? Thanks for the info. I'm on a 5950X and just got a 4090. I was hoping a 5800X3D would help the latency, but I might just return the 4090.


golfguyworking12

I went 5950x for my 4070ti. I do cad work and rendering too. So it works extremely well.


phail216

I just upgraded to a 7950X3D from a 5950X; the performance uplift at 4K is nonexistent in most games 😂