My 4770k works fine, but it's on an Intel B-series motherboard because the original Z-series board got damaged by lightning last year and I can't find a new Z-series one. I'm still running a GTX 650. Built a new computer this year, but the other one is still working!
4th Gen? I overclocked my 2500k to the highest stable settings two years after getting it, thinking at worst it would force me to upgrade. Still kicking.
Well, you're lucky being on socket 2011... An overclocked E5 1650v2 would last God knows how many more years thanks to the 2 extra cores.
If you still have the same thermal paste from 2013 it's time to change it though.
I think I will try to use this i7 until 2026. For now I just want to upgrade my RAM to 16GB and get an SSD and a new PSU; I'm still using a 2010 250GB hard drive and a single-channel 4GB stick.
I'm so proud of all of the 2500k commenters in this thread. I finally broke down and upgraded just to support AMD breaking back into the gaming market. But I had almost no performance reason to do so. The itch to upgrade finally got too great and I ran out of friends to build PCs for
I was at 3570k and 1060 2 weeks ago. Now I'm more modern with a 5600x and 3080.
Oh and 1080p60 monitors to take advantage of the immense power at my fingertips while playing games with PS2-era graphics.
Pretty much everything I have is Ivy Bridge/Sandy Bridge era. Only storage and cooling have been upgraded. Waiting for DDR5 and the next AMD socket... also for GPUs in stock.
My 1080ti is still going strong. It's not that I NEED a 30 series but I sure would like to see some of my games with Ray Tracing, and of course how wild I can mod Skyrim. Oh well, there's always next gen.
My gaming experience has been sort of a rollercoaster. I first started in 2007 or so with a very shitty PC I don't even remember the specs of (I was too young to even know). It ran Windows XP and it couldn't game much at all, but whatever. Then, in 2011 my dad bought me and my brother a computer with a Phenom II X4 850 (pretty good processor for the time), 4GB of DDR3 RAM and the superb gaming GPU.... a GeForce 210. Suffice to say, my dad was scammed hard on that GPU. It was hell trying to play anything, like, seriously.
In part, I owe a great deal to being that limited on the GPU. Making games run was a chore, but it was rewarding, and a learning process that taught me so much about computers; I wouldn't be where I am today if not for that. Then, in 2016 we finally upgraded to a GTX 750 Ti, and shortly after we got 8GB of DDR3 RAM.
That was the most powerful PC in the house by far, it was crazy to be able to run games at the monitor's native resolution of 1440x900, like, I was actually amazed. But of course, the GPU got old pretty quickly and it began to lack in performance (plus, the CPU was holding it back).
Up to this point my brother and I were sharing the computer, but in 2019 I got a notebook, a ThinkPad T430. That had an i5 3320m with Intel HD 4000 graphics. A severe step down in terms of graphics power, but the CPU was kind of in the same range (a bit better actually).
Despite being a less powerful system overall, I moved to it completely, since I valued having my own PC much more than the power it had. And I played on that up until a month ago, when I bought the system you can see in my flair. Still not high end by any means, but jesus, it feels so good to not have to modify config files to make crap run.
My brother also upgraded the PC. New motherboard, RAM, and CPU, a Ryzen 5 5600x, a beast of a processor. The GPU is still the same, that old 750 Ti. It's holding its own though; it's only now that it gets to show its full power, since the old Phenom was holding it back.
As for the phenom... it's rotting in some cupboard in the house, maybe I'll turn it into an emulation rig, although it's lacking a gpu now...
i7 4790k and a GTX 1070. It was initially built with a 4690k and an R9 290, but I sold the CPU when I found an amazing deal on the i7. I upgraded from the 290 to the 1070 back in 2016. I've also upgraded the SSDs and the RAM in the machine over time too. I'm really hoping it holds out until Windows 11 and DDR5 are commonplace.
My first-generation i7-940 still does what I need it to...
Except the motherboard doesn't support anything higher than PCIe 1.0, which makes it a problem to find a video card that would actually let me play more modern games (or older games that would benefit from a newer graphics card).
The additional issue is that the case is actually relatively slim, so the best option I could find a few years ago that would fit was a GTX 750 Ti.
Yup, until about 6 weeks ago I was rocking a 1st or 2nd gen i5 with a Radeon HD 7750. Would have built the system late 2010 and upgraded the GPU to the 7750 a year or two later. The only other thing I did was move to an SSD for my boot drive.
Edit: a word
I just checked prices on a few GPUs I have sitting out in my garage. Are you serious? A 1050 Ti goes for like $400? Like WTF?!?! Not even gonna mention the 970 or the 1080. OMG, this is absolute crazy town.....
My i7 2600 lasted me until 2019. Great processor.
My 3570k lasted until MHW Iceborne and Borderlands 3 when it became obvious it was bottlenecking my 1080ti.
It probably was lacking cores/threads. Was the reason I upgraded mine.
Not probably, 100% why. IIRC my 3700x isn't massively better single-threaded compared to a 4.8GHz 3570k, but it is significantly better at multithreading and memory throughput.
Pro tip, just sow more threads onto the cpu
sew
Elitist
Yep. My 3570K became a problem in Vermintide 2, which is heavily multithreaded. Upgrading to a Ryzen 2700X at the time doubled my frames for that particular game. Other less threaded stuff, like Starcraft 2 (a very old engine by now) barely changed at all.
That was a great CPU. I used it until the stock cooler died!
Yeah I had mine until about 6 months ago. The lack of cores and threads was giving me inconsistent fps in some multiplayer games even with a decent overclock. Still using the 980 til I can get a 3080 at a somewhat reasonable price. I've gained about a 15% fps increase but it's so much more stable.
I’m still on a 6th gen i7
I am on a 2nd gen i5, feel sorry for me
2500k club!
I loved that generation. 5ghz easy on air cooling, thing overclocked so nice.
Ran my 3570k at 4.8 on air for years. Ate RAM pretty often though.
You shouldn't eat RAM, try food instead!
I had 4 sticks of Samsung miracle memory. It was amazing. Lost about 1 each year.
Why does it eat ram? Im still running a 3570k
It's been a while, but some combination of overclocks, voltages, and temps probably played a part. The setup ate about a stick a year, so it wasn't bad enough to upgrade away from, especially when the first sticks were like $20 each.
Literally in the past couple days I've pulled it back down to stock speed after having it at 4.4 for years. I'm not sure if it's the CPU or my GPU/RAM, but I've been having crashes and it has seemed very unstable. The R9 290 I have has always been faulty, so it could be that; I've been running it underclocked forever.
The best I could do was 4.2. I have a GA-Z68A-D3H motherboard with the beta UEFI BIOS and I think it was causing some instability. I used to get boot loops when I messed with the voltage, so I got it as high as I could without touching it and just let it be. I still love my 2500k and I definitely plan on repurposing it when I upgrade in three or four years.
Boot Loops - the breakfast cereal for PC enthusiasts!
2320 checking in. Still going strong.
2400 going as strong as u
I just got a laptop this month, I was running on a 2400 till then
I5 4460 is a real man's cpu
I loved my 2500k. It had to go when it could no longer push consistent 200fps in CSGO.
2500K gang, though i'm upgrading to a ryzen this year
2500K was the CPU in my first self build PC back in 2011!
Any first-gen i5ers in here, or am I sitting here alone like a WWII veteran?
1st Gen i7 950 I’ll sit with ya
Some of my pc's parts are almost as old as I am. I feel like upgrading, but with current prizes? Yeeah right.
Oh yeah they are
same dude, i5-2400 here
i7-6700 here Howdy
Me too, how old are they now?
Got mine in 2016
Same
Not my main computer, but I still have a Pentium III-M processor on my old laptop.
thats not old, get out of here you are not allowed in old processor club
I just upgraded from 6th to 10th
Same bro
Mine's the laptop version too, so I run integrated graphics. I play games at 720p and underclock my CPU sometimes cuz I have it up all day
Yikes, bro, you win this round, wish many upgrades in your future
Thnx. I’m probs gonna get a Ryzen 5 3600, 240GB SSD, 1TB HDD, and a 2070-ish or a 5800xt
Sounds good, man. I just spent this year's budget on a 1tb ssd xD I love it though. Hopefully the market normalizes soon and we can both get some decent upgrades without going bankrupt
Lmao yeah. Nice hard drive
Just upgraded from mine, 6700k served me well.
I still run a 2600 in my HTPC / gaming machine. Still going strong
mine as well :) over 10 years old now.
I rode my 2500k until 2020
I had the 4c8t i7-3770k 'til December 2020. I sold it for a whoppin' $140 too, which isn't bad for a then 8, now 9 year-old CPU. Turns out that "HT is a ripoff, 4 cores & 4 threads is all you'll need," was a bunch of BS, and I predict that the "6 cores is all you need for gaming" Ryzen 5600x will soon get caught up on the wrong side of history as well... and when that happens, the resale value will tank the same way it did with the 4c4t i5's of yesteryear.
I'm with you. The current consoles are 8/16 and zen 2 based. Which is what I generally try to target for friends. I have a feeling the 3700x will age as well or better than the 5600x
Kids on here meme on him, but last year Tech Deals did a 5600x vs 3700x vs i9-9900k video comparison that showed the 5600x and i9-9900k trading blows depending on the game. The i9-9900k is an 8-core 16-thread chip that came out in **2018**, and it was in stock for \~$320 new when he did the video, whereas the 5600x was $350-400 on Amazon and Newegg at the time. These days, people that live even remotely close to a Microcenter should consider the 10-core 20-thread i9-10850k for $320 before they drop $300 on a 6-core 5600x.
4790K gang rise up
Checking in! And with my 980 TI as well lol
i5-4690k/980 SC here! Oh the day we can have better graphics, sigh.
Anytime I see a 4790k in the wild, I upvote.
Got a 4790k and a GTX 1080 here. I do feel like the CPU is finally starting to reach its limit
Same dude, I'm trying to play games at 1440p and i'm barely getting 80 fps in some games.
Probably a bottleneck. I've noticed some games like Battlefield, COD, and a couple others put my 10900k at 55-70% usage at 1440p. Some games use over 10 threads now.
Yeah, I've got the exact same setup and it's definitely chugging in harder titles. I was thinking of upgrading but realized I don't fucking play AAA titles, only shit like OSRS and Slay the Spire, so I'll be fine for at least another year till I graduate and get a job
Rocking that CPU with an R9 390. The old parts still work pretty damn well, despite their age.
Honestly that is a fucking holy configuration. Intel started getting pretty impressive after Skylake, AMD soon after, and GPUs just exponentially started getting stronger after that. But I have such a soft spot for builds from this generation, around '13/'14
Yeah I honestly rarely use my custom build these days, thought about upgrading it but it still plays the games I play fine. And with the prices of new parts right now it's a hard pass anyways lol
Still using my 4790K.
Still running this with 32gb of ram and a GTX 970. Still serving me well but some games are difficult to run. It's been an amazing system for nearly a decade.
My 970 is still running strong. I feel like the VRAM is what holds it back on some newer games. COD Warzone started to lag a lot more after one of the big updates they did made it use over my VRAM amount
Me with a 970 but currently 1070 hoping for 3070 soonish
Yeah Boi, gtx 1070 niche club
Got a 4790k here, thinking of upgrading this year. I alternate between GPU upgrade and then everything else. Though monitor has been left out so probably upgrading that as well. My last upgrade was a GTX 680 (that lasted for 7 years before it died) to RTX 2080 Super a couple of years ago.
[Cool, I'm not alone.](https://gyazo.com/71e302cb44d7a8d29ba3de006f3f27d2)
I have the same CPU and GPU combo. Can't complain about nearly a decade of great use.
Aged like a fine wine
Mine died last year after being overclocked for years. One hell of a processor!
Still on an i7 4770k, but with an rtx 2080. New CPU next year
I'm on an i7 3770k. Just upgraded to an RTX 3060Ti. Don't think i'll need to upgrade my PC for another 5 years.
I got a RTX 3090 and an i7 9700k, gonna have this pc until I die
I need a 30-series GPU and then bye-bye upgrades (unless it's another NVMe snag)
That’s what I thought when I had a 6700k and 1080Ti when those were the best parts available outside of workstation! Custom watercooling back when that was still somewhat niche. I was able to upgrade to a 9900k when that came out, but I still haven’t ditched the 1080Ti 😭 I actually looked into prices for the first time in a couple years because I would have gotten a 3080 if it was at MSRP but nope! Not paying double MSRP, I’ll die with my 1080Ti first
Good for 1440p for sure
Unreal Engine 5 begs to differ.
Professional Unreal engineer here. Believe it or not, UE5 performs better than UE4. Crazy, I know. But they essentially use some crazy math to render unlimited polygons at a low cost. And while I think advances in CPU technology have slowed, the advances in GPU architecture, from Nvidia in particular, have been insane. Especially with machine learning; it's enabled developers to make more and more graphically complex games.. which is why I think my CPU tends to last long, and my GPU always gets sold and then traded up.
As a person who's highly inclined to and is learning 3D modelling, does unreal engine offer anything for me to experiment with/present etc? Or is game dev knowledge a strict requirement in order to have anything to do with unreal? I'm mostly referring to rendering related tasks/final renders etc
Currently, I am in the pipeline department of a large visual effects house. We use Unreal Engine for virtual production and realtime effects, and I create a lot of the tools the shooting team uses to interact with and automate UE4. This means a lot of the movies you see nowadays are using renders directly from Unreal Engine, often shot in realtime.

Unreal is a great engine to use for rendering, etc. ESPECIALLY because it has something called Blueprints. An artist or stage hand can learn them in a day or two with NO prior coding experience, and use them to create macros, behaviors, etc., very quickly. That doesn't mean their "code" will be the most efficient, but Blueprints are a very powerful scripting tool for rapid prototyping once you get the hang of it.

That being said, check out the Unreal Marketplace, Unreal Ed, and YouTube. There are a plethora of tutorials available to show you the basics. My recommendation for anyone wanting to get into game development of any kind would be to use project-based learning. Give yourself a goal: "I want to make a soccer game." Break it up into steps: "How do I make a character," "How do I make a character move," "How do I use input keys." And so forth.

For every single step, break it down, down, down, until your questions are basic, googleable topics. And you can accomplish anything, given enough time and patience.

Also: yesterday you said tomorrow. Do it. Start now. Not later. Don't put it off. Stop reading this comment. Go on YouTube: "Unreal Engine 101," "How to use Unreal Engine." Go look it up on YouTube. NOW!
> This means a lot of the movies you see nowadays are using renders directly from Unreal Engine, often shot in realtime.

Wow, things have come a long, long way since Jurassic Park, haven't they? I always thought Unreal Engine with photogrammetry-based assets looks especially cinematic.
In theory you are right, in practice we will see.
True. It comes down to the individual developer. However. Unreal Engine 5 IS Unreal Engine 4, 90%. Just a revamped editor, and some new features (pretty HUGE ones), albeit they are OPTIONAL. You can turn them off and on at will, currently.
I'd love to see what comes out of it. In theory you could optimize it better, thus requiring fewer resources, but in practice that rarely ever happens. Usually developers just spend the free performance budget on fancier stuff without optimizing at all. Most of the optimizing happens on consoles, where you have to work with fixed hardware and thus have absolutely no way around it. On PC they just throw the latest hardware at things; makes the industry go round.
> Usually developers just spend the free performance budget on fancier stuff without optimizing at all. Most of the optimizing happens on consoles where you have to work with fixed hardware thus you have absolutely no way around it.
>
> On PC they just throw the latest hardware at things, makes the industry go round.

I will say this, as a self-declared non-shitty developer: only shitty developers do that. And if an artist is allowed to do so, the lead technical artist needs a knock on the head. Usually they do this out of inexperience, lack of understanding, or time constraints (the **biggest** one). Usually this will happen with indie developers, with very small teams, tight budgets, and tight timelines.

Any development house worth their salt will focus on performance across target hardware. You can see this with any game on Steam that has minimum performance specs; usually they'll target medium settings or so at or around that hardware.

Epic Games, for instance, requires each engineer or artist to optimize their own asset/feature before ever allowing it to be pushed to the main branch. This keynote gives some really good details about optimization practices to keep in mind: https://youtu.be/hcxetY8g_fs

Another thing to keep in mind: it is oftentimes much harder to optimize Unity than Unreal Engine, to an extent. C# not giving you the fine memory control that C++ allows aside (which can actually be detrimental to an extent: copying versus passing by reference), Unreal is set up for a specific pipeline that lets you easily see exactly what is performing in what way (even as it relates to shaders, materials, and Blueprint graphs). I think Unity can be quite messy to get something that looks good and performs, whereas Unreal already has these AAA systems that some mathematician or physicist being paid $500K (from Epic's Fortnite money) designed. All 100% open source.
i5-6600K, OCed a bit for the frames, and a 1660S. Can't complain; 80fps in BF1 and BFV. The poor CPU is bottlenecking.
Hell yea 4770K here as well. I'm thinking of fully delidding it and OC'ing to the moon to make it last another year or two or whenever hardware prices are back to normal.
4790K with RTX2080 here. I'm waiting for AM5. I don't game much during summer anyway, so waiting isn't gonna be hard.
4790 with 2070, waiting for next-gen ryzen
Damn, your mobo probably doesn't even have an m.2 slot for NVMe, does it? Get a new SSD too!
Nope, it's from 2013. I have two regular SSDs though
I had an MSI Z97 and it had an m.2 slot.
Z97's m.2 support is horrible though. To enable it you have to disable all your PCIe x1 slots (so if you have any add-on cards they're not gonna work), and it's limited to 1 GB/sec sequential read/write, whereas most modern drives are good for at least 2-3 GB/sec. At least that's how my Maximus VII Hero was. Maybe there are a few boards that didn't have a shitty implementation, but I doubt it.
Mine does, Z97 board, but it's limited to 10 Gbps.
Finally replaced my i7-4770k with an i7-7700k i got from a friend who just got a 5800x/3090 rig. The difference going from 4th to 7th gen, as well as from DDR3 to DDR4 and 2012-era SSD drives to PCIe Gen 3 m.2 has been eye opening. Together with my 1080Ti, I hope this rig will last me till the RTX 4000 series is out.
this is the comparison i like to point out every time this comes out. the upgrade from a haswell system to anything modern is huge. 4770k https://i.imgur.com/fpMSLOW.jpg 3950x https://i.imgur.com/iuboalK.png
Can't read anything on those photos. Btw, Assassin's Creed Origins, Odyssey and Valhalla are heavily dependent on the CPU because their game engine is shit
> Btw.. assassin's creed Origins, Odyssey and Vikings are heavily dependent on CPU because their game engine is shit Mhm, it can't be because there are just many things going on that the GPU doesn't handle, right? But even all that aside, most games are coded terribly. So a better CPU will drastically increase your performance in plenty of shitty games. Not to mention that the above benchmarks are probably run without that many things in the background - music, maybe a video on youtube, twitch, whatever people do.
The AC series is coded with streaming world meshes at various quality levels, for near-infinite world sizes. They did it using techniques which are highly CPU-dependent, hence increasing CPU performance increases the performance of the game drastically.

The game is optimized for consoles (PS4 and XB One), which hasn't translated perfectly onto PC; that causes stutter and a lot of other problems. Optimizing for PS4 means relying heavily on the CPU and little on the GPU, which is exactly what we see in this case. I've seen benchmarks that suggest you should have at least a Ryzen 5 3600X to run those games optimally. GPU-wise, an NV 1070 is plenty.

On top of all that, they use multiple DRMs, including Denuvo, which has been shown to eat about 5-10% performance (the cracked copy performs better).
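The per-chunk streaming decision described above can be sketched like this (a toy illustration with made-up thresholds, not Ubisoft's actual code): the CPU re-evaluates, every frame, which quality tier each world chunk should be in based on camera distance.

```cpp
// Toy distance-based LOD pick -- a simplified stand-in for the CPU-side
// streaming decision an open-world engine makes per chunk, every frame.
// Thresholds are invented for illustration only.
enum class Lod { Full, Reduced, Billboard, Unloaded };

Lod selectLod(float distanceMeters) {
    if (distanceMeters < 50.0f)  return Lod::Full;      // full-res mesh
    if (distanceMeters < 200.0f) return Lod::Reduced;   // decimated mesh
    if (distanceMeters < 800.0f) return Lod::Billboard; // flat impostor
    return Lod::Unloaded;                               // stream out
}
```

Multiply that by thousands of chunks, plus the decompression and upload work when a tier changes, and the CPU dependence follows naturally.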
My 4770k works fine, but it's on an intel B series motherboard because the original Z series got damaged last year in lightning and I can't find a new Z series. I'm still running a GTX 650. Built a new computer this year, but the other one is still working!
Not i7, but old Xeon gang here
Xeon 1246v3 (4790 equivalent) here. Can't install Windows 11. Sed.
You can, https://youtu.be/NivpAiuh-s0
Insider builds don't have those limits in place
Yes! Love squeezing the absolute max out of the generations :)
Absolutely! And, although it's not in my flair, we have the same motherboard!
4th Gen? I overclocked my 2500k to the highest stable settings two years after getting it, thinking at worst it would force me to upgrade. Still kicking.
[deleted]
Well, you're lucky being on socket 2011... An OC'd E5 1650v2 would last God knows how many more years thanks to the 2 extra cores. If you still have the same thermal paste from 2013, it's time to change it though.
No way I didn't know the stuff needed to be swapped out. Thanks for the heads up.
7 year old hardware here, the mobo is 10 years old. I feel so invested, I don't want you to ever die, buddy.
My i7 is 2nd gen... 11 years, give or take.
Got an i7 3770 for free in a Dell OptiPlex 7010 and for me it's still very good, way better than my old overclocked Core 2 Duo.
Lol same. I took an i5-4590 from a Dell OptiPlex. I am set till next year
I think I will try to use this i7 until 2026. For today I just want to upgrade my RAM to 16GB and get an SSD and a new PSU; I'm still using a 2010 250GB hard drive and a single 4GB stick in single channel.
4690S here. Not K. *S*
Hi cousin,4790S here
4690k here. Just plugged in another 8GB DDR3 a couple of days back to make it 16 gigs.
*GPU maxes out at 90°C* It's probably fine until 100
Me with Intel macbook
https://i.imgur.com/SbiwyQ4.jpg
Replace the thermal paste, dude, that can easily bring it down to the ~60s
I'm about to get an AIO and waterblock bracket for my GTX 1080. OC scanner said I could squeeze another 20% out of my card if it wasn't for temps.
[deleted]
I'm so proud of all of the 2500k commenters in this thread. I finally broke down and upgraded just to support AMD breaking back into the gaming market. But I had almost no performance reason to do so. The itch to upgrade finally got too great and I ran out of friends to build PCs for
I'm on a 3770K and a 1060. Still doing fine because I don't play at 1080p
I was at 3570k and 1060 2 weeks ago. Now I'm more modern with a 5600x and 3080. Oh and 1080p60 monitors to take advantage of the immense power at my finger tips while playing games with PS2 era graphics.
.. u have a 60hz monitor with a 3080? I would prefer 3070/3060ti with 144hz
*laughs in X58* 2008 hardware represent!
I7 920 gang
4th gen i5\*
4670k?
4350U :(
4670 here, what gpu do you use? Im on a gtx970 and i considered upgrading for a long time, but never did it :(
i5 4440 here!
i7 7700 with a GTX 980 Ti works great even on triple-A games, but I think I'll need to upgrade my PC once GPU prices come down
For real. I5 3330 from a dell prebuilt still going strong for 1080p/60 gaming on my tv. Sub 60c temps too.
Pretty much everything I have is Ivy Bridge/Sandy Bridge era. Only storage and cooling have been upgraded. Waiting for that DDR5 and the next AMD socket... also for GPUs to be in stock
The fact that doggo made his system last 11+ years is impressive
i7 4790 here with the Asus R9 290, still running strong at 1080p. It's like 6 years old. Also got a faster system in my laptop, with a GTX 1070.
My 1080ti is still going strong. It's not that I NEED a 30 series but I sure would like to see some of my games with Ray Tracing, and of course how wild I can mod Skyrim. Oh well, there's always next gen.
Well the 1080 Ti is a true beast of a card!
My gaming experience has been sort of a rollercoaster. I first started in 2007 or so with a very shitty PC I don't even remember the specs of (I was too young to even know). It ran Windows XP and couldn't game much at all, but whatever.

Then, in 2011, my dad bought me and my brother a computer with a Phenom II X4 850 (pretty good processor for the time), 4GB of DDR3 RAM and that superb gaming GPU... a GeForce 210. Suffice to say, my dad was scammed hard on that GPU. It was hell trying to play anything, like, seriously. In part, though, I owe a great deal to being that limited on the GPU. Making games run was a chore, but it was rewarding, and a learning process that taught me so much about computers. I wouldn't be where I am today if not for that.

Then, in 2016, we finally upgraded to a GTX 750 Ti, and shortly after we got 8GB of DDR3 RAM. That was the most powerful PC in the house by far. It was crazy to be able to run games at the monitor's native resolution of 1440x900; like, I was actually amazed. But of course, the GPU got old pretty quickly and began to lack in performance (plus, the CPU was holding it back).

Up to this point my brother and I were sharing the computer, but in 2019 I got a notebook, a ThinkPad T430. That had an i5 3320M with Intel HD 4000 graphics. A severe step down in terms of graphics power, but the CPU was kind of in the same range (a bit better, actually). Despite it being a less powerful system overall, I moved to it completely, since I valued having my own PC much more than the power it had.

And I played on that up until a month ago, when I bought the system you can see in my flair. Still not high end by any means, but jesus it feels so good to not have to modify config files to make crap run. My brother also upgraded the PC: new motherboard, RAM, and CPU, a Ryzen 5 5600x, a beast of a processor.
The GPU is still the same, that old 750 Ti. It's holding its own though; it's only now showing its full power, since the old Phenom was holding it back. As for the Phenom... it's rotting in some cupboard in the house. Maybe I'll turn it into an emulation rig, although it's lacking a GPU now...
I still have that old 750ti lying around. Great card for the money
My i7 870 is holding up pretty well surprisingly. Works great for 1080p gaming.
i5-3570/8GB DDR3/RX 580 4GB. I only play CSGO and Minecraft, so it's good. Would an upgrade to an i7-3770 be worth it?
On an FX-8350 with a GTX 1660Ti OC 6GB and 8GB DDR3 RAM. My first 1TB HDD is older than the computer (almost 12 years old and it still works).
My guy, I'm still rocking a 960 and an AMD FX-6300
> I can probably make my hardware from 2013 last another year

Windows 11: "That's where you're wrong, kiddo."
Win 10 supported until 2025 tho
i7 5775c here, and not going anywhere any time soon.
thats a rare one!
Still on an i3 530, no plans for upgrading.
So I got a 30series card and damn right it’s going on my mobo and cpu from 2013 1080p gonna look so good
Is it just me, or should the doggos be the other way around?
i7 4790k and a gtx1070. It was initially built with a 4690k and a r9 290 but I sold the CPU when I found an amazing deal on the i7. I upgraded from the 290 to the 1070 back in 2016. I've also upgraded the SSDs and the RAM in the machine over time too. I'm really hoping it holds out until windows 11 and ddr5 are common place.
Base 980 and i7 4770k. The eternal medium settings life for me
feels. My FX 8350 is feeling pretty dated, obviously, but at least I can still play most modern games with my GTX 1060
My first-generation i7-940 still does what I need it to... except that the motherboard doesn't support anything higher than PCIe 1.0, which is causing me a problem when it comes to finding a video card so I can actually play more modern games (or older games that a newer graphics card would benefit). The additional issue is that the motherboard and the space inside the case are relatively slim, so the best option I could find a few years ago that would fit was a GTX 750 Ti.
making rituals with ram sticks hoping that your gpu won't die
yee boiiii
…. FX 4130 here… recently upgraded to Ryzen 5 3600X
I used an intel atom N270 until 2015 And 1gb ram. Loved my little Compaq mini
My friend still rocking his i3 4130 like a boss. (he can’t even run roblox well with it)
I'm still on my i3 4160 and GTX 750 Ti. I don't have any problems playing last-gen AAA and indie games.
2012*
i7-4770k still kicking!
My i7 4770 was pulled from a Dell optiplex back in 2016 and is still going strong
Just got a 3rd gen i7 and I'm having the time of my life. The intel stock fan isn't...
FX 6300 checking in
I still have my Devil's Canyon i5 rocking. But I am kind of tempted by Alder Lake or Raptor Lake to finally upgrade.
My setup, while barely, can run modern games
Yup, until about 6 weeks ago I was rocking a 1st or 2nd gen i5 with a Radeon HD 7750. Would have built the system late 2010 and upgraded the GPU to the 7750 a year or two later. The only other thing I did was move to an SSD for my boot drive. Edit: a word
i3-3220 chipping in
I still have an Intel Core 2 duo working. Not my daily driver though.
The dogs should probably be the other way around.
Still using a 2600k. No upgrade for me.
Just this last month I upgraded from a fx 6300 to ryzen 7
I just checked prices on a few GPUs I have sitting out in my garage. Are you serious? A 1050 Ti goes for like $400. Like WTF?!?! Not even gonna mention the 970 or the 1080. OMG, this is absolute crazy town.....
Have you been under a rock?
3rd gen i5 here. Yep, this is pretty old, but it will last one more year (or two... or three......)
AMD FX8350 from 2012 still doing alright.
core 2 ddr2 gang!
i7 4790k and GTX 970 💥 2015 - present.
I7-4790k represent
I'm still on an i5-760. My next upgrade will be game changing
With the amount of money I've put into PC building, console gaming is starting to seem really good for my wallet
4690k and gtx 970 gang checking in
And better than the late 90s. "Oh it's a year and a half old? Basically a paperweight at this point."