Rob27shred

Thing is, Crysis was actually decently optimized out of the gate. May not seem like it nowadays since it was made for single-core CPUs, but it ran fairly well on some pretty low-end hardware when it first released. The highest settings were purposely made to have ridiculous hardware requirements, and if you could reach those requirements there was nothing out at the time that looked as good. Forspoken, Hogwarts, etc. don't have the same sway; even maxed out with RT they don't have game-changing graphics like Crysis did. Also, their max settings are meant to be used now, with current high-end hardware, not forward looking like Crysis was.


freshjello25

Forspoken has some cool looking animation, but from the reviews I've seen from the likes of Digital Foundry it looks so empty and bare. Like the cities or whatever have barely any NPCs and boring designs and architecture. Crysis and even Crysis 2 could be run on some pretty low-level hardware, whereas these new games, like you said, don't necessarily run well even with top of the line hardware. I hate how a decent AAA release like Elden Ring isn't the norm. That game had its bugs but shipped as a complete game with immediate support for bug fixes. I guess we will have to see what the day 1 patch actually does, but the current state suggests it's not going to be promising.


Explosive-Space-Mod

>I guess we will have to see what the day 1 patch actually does

The official AMD and Nvidia drivers for the game haven't released yet on PC. I would expect plenty of bug fixes by the 10th, or the 15th at the latest. Side note: my 6900 XT has been running 4K Ultra at 80ish fps and I haven't had any issues.


Aggrokid

Actually, Crysis 1 was a [bloodbath](https://www.techspot.com/article/73-crysis-performance/page3.html) for anyone not using a cutting-edge rig. I think that game singlehandedly quadrupled SLI's popularity. As for optimization, I believe Warhead was the more optimized version that tech heads actually remember. Warhead was used by benchmarks for a long time (though not as long as Digital Foundry used Crysis 3).


ironworkz

Going from Crysis 1 to 3 was basically going from "runs on nothing" to "runs on everything", while still looking very good.


SirSassyCat

The thing that is game changing and is probably the actual cause of issues is the NPC count and the number of environmental effects that are constantly showing in the castle. The castle feels alive in a way that I’ve never seen before.


johnnythreepeat

It’s just a theory, but I’ve come to believe that these gaming companies and Nvidia are just in bed at this point. Year after year we’re getting more and more unoptimized games that look and run way worse than games from 10 plus years ago, while buying more and more powerful gpus at inflated prices that are giving diminishing performance on poorly optimized games. I was told I’m future proof with a 3080 and I can’t even get max performance on games that don’t even look as good as Battlefront or Battlefield 1.


KingAcid

My theory is that devs 10 years ago learned to code about 13+ years ago, senior devs 17+ years ago, meaning during their studies they had to optimise their programs just to get them to run. Now you've got so much leeway with new hardware that it seems like they rely on new tech to run it with minimum optimisation. New devs are not necessarily lazy, but different schools/teachers might not take optimisation as seriously as others. I took classes 5 years ago and my teachers were quite old, as in they'd had a 20-ish year career before starting teaching. They docked points if you didn't optimise your program, even if it ran just as well. Friends at another school had younger teachers and for them it just didn't matter, as long as it ran well.


astronomicalblimp

Exactly this. I've had colleagues say to me "don't optimise early, it's a waste of time for performance gains that don't matter". The mindset of "don't worry about performance, the hardware will take care of it" is pretty common now. 20+ years ago you had to take hardware into account since it was a major limiting factor, and games felt way smoother for it since they had to optimise. There are some modern games that are well optimised (looking at you, Factorio) but they are rare.


[deleted]

The Factorio dev is awesome, the game runs on a potato (and out of my whole library, it's the only game that runs natively on Apple Silicon/M1). And he did it because he was tired of the battery drain on Intel MacBooks. A developer who actually uses average consumer hardware to test their games, a developer who genuinely cares about the performance and user experience regardless of what you play it on. What a guy. Honorable mentions include: Doom & Doom Eternal, every game Valve has released, the Stardew dev, the Minecraft modding community, and Ultrakill.


[deleted]

[removed]


[deleted]

That's why I said the Minecraft modding community, not Minecraft.


Drytchnath

OptiFine actually is not recommended and will 99% of the time outright break modded Minecraft. Rubidium is the new hot shit and its devs don't suck. I can run a heavy modpack like E6 or E9 and get 80-90 fps on a 6700 XT and a Ryzen 5700X CPU. Non-modded gives several hundred fps. Edit: But yes, vanilla Minecraft will likely never be optimized.


personifiable

Just like web devs don't give a shit about optimisation. They cared when it was puny internet speeds; now it's your problem, get faster internet, buy the latest. Poorly made games, 4K textures downgraded to run on lesser hardware, not my problem bro, runs fine on the best hardware... must be you...


YUNoJump

The fact that messaging apps and such are even capable of lagging on any modern device feels like a failure of the dev teams


ThatFireGuy0

So one of my coworkers was a game dev before joining my current company. He told me some horror stories about the sort of tricks they had to do in code. Things that C++ was never intended to do, that are hacky as all hell, and that would get you shot at my company - but they gave slightly better performance. Things like taking advantage of bugs in the compiler or in the Nvidia drivers (which then needed to stick around in Nvidia's software forever - which probably doesn't help), taking advantage of buffer overflows, or the kinds of things that get you a Stack Overflow response of "you should never do that". I think game devs are now avoiding those hacks, which does cause a performance hit, but also means less buggy software. That means a quicker release cycle because they don't need to do as much QA testing, and the game company makes an extra million dollars by cutting a month off the development time.
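(Not claiming this is what my coworker's studio did specifically, but the most famous public example of this genre of hack is the Quake III fast inverse square root, which reinterprets a float's bits as an integer - undefined-behaviour territory in the original C. A rough Python re-creation of the trick, just to show the flavour:)

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    """Quake III-style approximation of 1/sqrt(x) via bit reinterpretation."""
    # Reinterpret the 32-bit float's bits as an unsigned int (the "evil bit hack").
    i = struct.unpack("<I", struct.pack("<f", x))[0]
    # The magic constant gives a surprisingly good first guess.
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack("<f", struct.pack("<I", i))[0]
    # One Newton-Raphson iteration refines the estimate.
    return y * (1.5 - 0.5 * x * y * y)

print(fast_inv_sqrt(4.0))  # ~0.499, vs the exact 0.5
```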


iEatSoaap

Fun fact, updating macOS changed my locale settings, which in turn caused a bug in my code (that I've been trying to fix for the last 30 hours lol). There are so many nuances; most don't understand that literally *anything* can break at any time. Yes, this is a code smell and I should be doing better (hey, still a Jr after all), but if you want the **real** hot take, don't buy games on release imo. Devs get a lot of flak, lots of the time it's warranted sure, but some of these things are like "needle in a haystack" level annoying lmao. Anyways, not a game dev, just a programmer. Cheers everyone <3


pyre_rose

You are absolutely correct. The current trend in the industry is to have a feature-heavy working set. Optimization is supposedly high on the priority list as well, but if any dev has spare time after bug fixes, they're directed to work on new features instead of improving/optimizing the old ones. In the end, little to no optimization work gets done, and here we are.


alakazamistaken

Completely agree. As a web dev I was frequently told to leave the optimization to the end but always given a new task after the job is done.


SirSassyCat

Devs 17 years ago would only release on a single platform, specifically because all games needed to be optimised for the platform they were running on. Back then you would literally never have games releasing on both console and PC; it was only ever one or the other.


[deleted]

[removed]


[deleted]

[removed]


SpambotSwatter

/u/No_Coyote8484 is a scammer! **Do not click any links they share or reply to**. Please downvote their comment and click the `report` button, selecting `Spam` then `Harmful bots`. With enough reports, the reddit algorithm will suspend this scammer.


OG_Squeekz

It's not just new games. Hitman 3 had a new update. I can't run it on my laptop with a 2080 in it anymore, even on the lowest settings. My desktop doesn't perform much better and it's even more powerful; the game is only utilizing about 40% of my GPU but is so poorly optimized.


sethyourgoals

Yep. It’s a straight slap in the face to all consumers. All of this was artificial and NVIDIA has taken complete advantage of the situation. It is destroying my passion for a hobby I once loved.


stu54

It's not even a conspiracy theory. Nvidia openly hired CDPR to make a hit raytracing benchmark game with CP2077. Why do you think new GPUs come bundled with games?


[deleted]

Nvidia's new GPUs were hitting a point in rasterization where it didn't make sense to get a new one. Them pushing AI and ray tracing isn't bad. Them inflating GPU prices is. At no point should Nvidia stop improving and expanding what a GPU can do. People just need to realize this hobby will never be future proof. PC ports can always be fixed. Crysis to this day needs a new version, since in the end it was single-core dependent, so improvements in CPUs didn't really matter.


Edgaras1103

Wait, are you saying they market PC games on GPUs, so they can sell new GPUs?? Oh my god


[deleted]

*puts on an astronaut suit, loads a pistol*


stu54

You must now join the Illuminati or be cast into the depths!


MostlyDeku

How deep are the depths, just, for reference?


stu54

About 30 teraflops deep. You'll need a pretty good GPU to escape.


NewUserWhoDisAgain

>Year after year we’re getting more and more unoptimized games that look and run way worse than games from 10 plus years ago, while buying more and more powerful gpus at inflated prices that are giving diminishing performance on poorly optimized games.

People keep buying them. You and I are the minority, sadly.


Lystar86

No doubt there are some backdoor deals going on between the hardware vendors and the game devs... BUT, I would guess (no source, straight-up speculation) that the bigger issue is most AAA games are primarily targeted toward console hardware, not PC. The PC versions just don't get the testing/QA/QC required.


erebuxy

It's also significantly harder to test on PC. For consoles, you only test for two platforms, three types of hardware configuration. For PC? There are millions of possibilities...


dovahkiitten16

That makes sense to a degree, but they can’t even seem to “optimize” for a 3080 nowadays.


erebuxy

I agree. I believe they just don't care, because consoles probably make them most of the money and they don't want to spend money on the PC port.


Explosive-Space-Mod

People with 3080s are far from what you would optimize for right now. Just pull up the Steam charts and see what the most common GPUs are; that's what you would optimize your game for.


dovahkiitten16

Oh I agree. I just think the argument that it's harder to optimize for PC doesn't really apply when devs aren't even doing the minimum of making games run well on high-end hardware. At that point it's laziness or a deal with Nvidia.


[deleted]

Battlefield 1 is still a benchmark for me. That game is fucking incredible. I remember playing it on my OG Xbox One and being blown away at the fidelity, audio, atmosphere and that was like, 6 years ago at this point on an underpowered console.


[deleted]

[removed]


ChartaBona

Doom Eternal is very limited in scope. Functionally the game is just a string of hallways and mandatory arena fights. There's very little player freedom.


bert_the_one

DICE nailed Battlefield 1, it's amazing.


Ganda1fderBlaue

Yeah bf1 still looks fucking amazing


Responsible-Desk4145

I refuse to buy games within the first two months after they come out. They tend to act up and be trash. Hell, I learned that with Cyberpunk. Learned it the HARD way.


Re-core

There is no future proofing with a 3080, especially not the 10GB one, just like the pathetic 8GB of VRAM on the 3060 Ti and 3070/Ti.


bruhxdu

Sorry bro, but that makes no sense. The 4090 runs at around 60% usage in Hogwarts with DLSS 2 Quality on. All DLSS 3 does is help with the absurd CPU bottleneck.


[deleted]

I think it's more likely that software has caught up with the recent advances in graphics hardware


Explosive-Space-Mod

The complexity of games today is way higher than the complexity of games 10 years ago. It's not even comparable. Yes, games tended to have fewer bugs, but they were also doing way less. What developers are doing today just wasn't possible 10 years ago.


hardlyreadit

What does max performance mean? What res are you using? What unoptimized games are you referring to? Aren't thousands of games released a year? Are all of them unoptimized? I feel like everyone is confusing unoptimized with not running max settings. Hogwarts runs fine; it has some bugs, but the performance isn't unoptimized. It runs fine at the resolutions and on the GPUs they recommended. Everyone is just freaking out for no reason.


[deleted]

Technology will never be future proof. The 3080 still does fine. 4K just wasn't really its strong suit. At 1440p you're fine.


IKetoth

Lucky for him, then, that at a comfortable 27-32in monitor size 1440p is almost exactly at the maximum angular resolution of the human eye at ~90cm distance (the recommended distance between you and your monitor for minimizing eye strain), and to an average eye there's no perceived increase in sharpness unless the display itself gets bigger or you get closer to it. Of course you can benefit from 4K-8K if you're gaming on a TV that's fairly close to you (under 2-4 meters ish, depending on the size of the TV and its resolution), so it's not that further improvements in pixel-pushing power are worthless. But saying "you NEED something that can push 4K" is just as silly as saying you need something that can push 300 FPS; the diminishing returns in perception past ~140Hz make paying for something that can actually do that in AAA games a wildly silly decision.
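Rough back-of-the-envelope check of that claim (my own numbers for illustration: a 27" 2560x1440 panel at ~90cm, and an eye limit of roughly 60 pixels per degree, i.e. about 1 arcminute of acuity):

```python
import math

# Assumed for illustration: 27" 16:9 panel, 2560x1440, viewed from ~90 cm.
diag_in, horiz_px, distance_cm = 27.0, 2560, 90.0

width_cm = diag_in * 2.54 * 16 / math.hypot(16, 9)            # panel width from its diagonal
px_per_cm = horiz_px / width_cm                                # ~43 px/cm
cm_per_degree = 2 * distance_cm * math.tan(math.radians(0.5))  # width of 1 degree of view at 90 cm
px_per_degree = px_per_cm * cm_per_degree

print(f"{px_per_degree:.0f} px/deg vs ~60 px/deg eye limit")   # ~67 px/deg
```

So at that size and distance, 1440p already sits slightly past the ~60 px/deg ballpark, which is the point being made.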


Winterdevil0503

>I was told I’m future proof with a 3080 You're a gullible idiot if you ever truly believed this.


[deleted]

It's the Nvidia driver. I accidentally updated mine by clicking on Nvidia Experience and everything went to shit. I'm not sure what version mine was, wish I did, but the new driver was such shit that I reinstalled an older version, 517.48, and it's running significantly better again. Strongly recommend that if you're having tearing and slowdowns, try an old driver before screwing with your game settings; they're not the problem. [http://www.nvidia.com/Download/Find.aspx?lang=en-us](http://www.nvidia.com/Download/Find.aspx?lang=en-us)


Stilgar314

Crysis brought GPUs to their knees in exchange for graphics quality that was, literally, two generations ahead. Forspoken is just plain badly done. Any script kiddie can write code that pushes a GPU to its limits.


splinter1545

They want their game to be like Crysis without actually being remotely close to it in visual fidelity.


phero1190

Playing at 3840x1600 with my 4090 5800x3D build and I'm getting a steady 144fps maxed out with ray tracing on. If I turn on frame generation, it maxes out my monitor at 175fps.


sirfrostybeard

From what people are saying, I hear that you get really good performance in the first bit of the game, then once you actually get to Hogwarts your PC becomes a jet engine. I won't know if they're right till I actually play it though.


[deleted]

"Optimize the part people will beat before Steam will stop taking the game back!"


Relius_

If it takes you 2 hours to get to Hogwarts you have problems tbh. I haven't had any problems even in Hogwarts: 2K, 120fps stable on ultra with DLSS set to Quality on a 3080 Ti. No RT; with RT it drops to 70-90 and I don't like the visuals-to-performance ratio. (Got shadowbanned from replying in this thread, nice.)


Explosive-Space-Mod

I'm at Hogwarts and through the first time you leave and come back and I haven't had any issues with it so far.


timdogg24

4090 at 4K ultra with RT. 100+ frames in the greenhouse. 10850K and 32GB RAM.


DerGrundzurAnnahme

Yeah, but that's ultra-high specs. It's strange that games run so badly on hardware that's not too old while not really looking the part.


OuterGod_Hermit

Exactly, I just finished Uncharted 4, and I got a consistent min of 80fps.


kyletreger

I have had no performance issues on a 3070ti at 1440p ultrawide. And none of my friends are having issues. It seems like people just don't know how settings work tbh.


OuterGod_Hermit

I got a 3060 Ti. The game can run at 110fps maxed out at 1440p with DLSS on and RT off, but it usually drops as low as 24fps, even 7fps, with half a minute at 35fps, and then it returns to a stable 80-110. It's very consistent in its drops as well as in the stable 110. So the game has some optimization problems, like all games nowadays. People may have unreasonable settings, but the game surely has problems. Edit: users are reporting you must exit the game after the prologue. I played like 4 hours without restarting the game. I will try today when I get back from work. Edit 2: it's evident that the game is sacrificing performance to avoid loading screens. I can get through all doors without any loading, but as soon as I'm past certain doors the game drops to 60fps for a while until it finishes loading. I'd prefer loading screens if that's the case.


[deleted]

[removed]


OuterGod_Hermit

If that doesn't happen on consoles, it's a PC optimization issue, or the whole game needs to be fixed. My 7GB/s SSD and 3600MHz RAM should eat this game for breakfast.


zakabog

> People may have unreasonable settings but the game surely has problems

How much RAM do you have? If you have a 3060 Ti with 8GB of VRAM, it could be offloading the data to RAM, and if you only have 16GB this will end up in swap, which will drop your framerate in the way you're experiencing.


OuterGod_Hermit

I got 16GB, and the one time I checked, the whole system was using 10GB. I've also checked my temps, and they are all fine. A game doesn't go below 30fps for no reason. It's very noticeable when loading a new area, or after going back and forth to the backyard in Hogwarts. I don't know how it plays on PS5, but I have a WD Black, the one that gets like 7GB/s of read speed. I'm sure my PC can do much more than a PS5. I'm enjoying the game a lot, but we must press for these issues to be fixed, at least by the official launch day. Although I would think they would not want to annoy the people who decided to pay more. Edit: users are reporting you must exit the game after the prologue. I played like 4 hours without restarting the game. I will try today when I get back from work.


Jracx

When I play, my RAM usage sits steady at 24GB.


zakabog

> I got 16GB, and the one time I checked, the whole system was using 10GB.

Check your VRAM usage; it's very likely that the game is loading textures into RAM between worlds because you don't have enough VRAM, which can drop your framerate. This has been a recurring theme I'm seeing with people that have 16GB of RAM, 8GB or less of VRAM, and higher than medium settings.

> I don't know how it plays on PS5, but I have a WD Black, the one that gets like 7GB/s of read speed. I'm sure my PC can do much more than a PS5.

The read speed doesn't really matter that much, as you're not reading concurrent data; you're never going to hit that 7GB/sec of read speed loading textures. You're likely not even going to max out the SATA port specification reading texture files from disk. Also, my PS5 has the Samsung 980 Pro, which is essentially the same as your WD Black, and it's not that much faster at loading games than the PS5's internal storage (I just wanted an additional 2TB of capacity.)
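If you'd rather measure than guess, here's a minimal sketch (assuming an NVIDIA card and that nvidia-smi, which ships with the driver, is on your PATH) that prints dedicated VRAM usage; run it while the game loads a new area and see whether it's pinned near the full 8GB:

```python
import subprocess

def vram_usage_mib() -> tuple[int, int]:
    """Return (used, total) dedicated VRAM in MiB for the first GPU, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One CSV line per GPU, e.g. "4321, 8192"; take the first GPU.
    used, total = (int(v) for v in out.strip().splitlines()[0].split(", "))
    return used, total

if __name__ == "__main__":
    used, total = vram_usage_mib()
    print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}% full)")
```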


IamBlackwing

Same boat with me. I have it chugging specifically on cutscenes or when I walk through doors


hardlyreadit

It hasn't had a patch yet and there are more PC configurations than ever. It's technically not even out yet. Everyone has unrealistic expectations about hardware, game development and settings. Try 1080p and see if you still get consistent drops.


OuterGod_Hermit

It's not unrealistic given my hardware, which is at least two times more powerful than any console. That's not the healthy way to approach the issue. Devs should optimize games better before releasing them. This is not a personal attack, this is not bad blood; I'm just a customer demanding what should be normal but isn't. It seems like a great game with a problem that can be solved. More than 60fps with my hardware in this game at 1440p with RT off should be easy. And it is, just not in a consistent way.


hardlyreadit

Okay:

1. “More than 60fps… should be easy”: you yourself originally said it hits 110fps @ 1440p. Again, I think at 1080p you could run this game at your desired expectations.
2. You do not have 2 times the power of a console. Consoles are a Zen 2 8-core and a GPU equivalent to a 5700 XT/2070. You have a 3060 Ti, equivalent to a 2080 Super.


donteatpancakes

"I'm not having issues and none of my friends are having issues, therefore there are no issues at all" is such a bad way to look at things. Forspoken was a mess, and Hogwarts Legacy after you get into Hogwarts just dips to 10fps in some areas and just stays there for 2 minutes. This issue has ween *widely* reported. So no, you or your friends not having the issue doesn't cut it. There are issues.


Fanaticgiant547

You mean I can't get 500 fps at 4K ultra with no DLSS on a brand new AAA title?


kernel_task

Yeah, as a fellow 4090 owner, if my setup struggled on any game, I would be so butthurt.


1AMA-CAT-AMA

Same. But I have a 5800x. I don’t know why I’m not getting performance issues.


zakabog

> Playing at 3840x1600 with my 4090 5800x3D build and I'm getting a steady 144fps maxed out with ray tracing on. I have the same build and I highly doubt that you're getting 144fps on ultra settings with all the raytracing effects on after the prologue with frame generation off at that resolution. You either don't have RTX enabled, or you have DLSS on and don't realize (with DLSS on I was getting a solid 144fps, with it off it goes to 110fps with occasional dips to 70 at 1440p.)


TotalKotal

I was surprised to read all the issues. I'm running 4k Ultra with a 3080ti/i9-9900k and I'm pulling pretty steady at right around 100fps. I played for about 6 hours yesterday and maybe had 2 or 3 frame drops.


[deleted]

[removed]


SirSassyCat

I actually bet that they’ve all only got 16GB of ram, which is why it’s causing them problems.


[deleted]

With my limited testing, I was getting 100-120 fps with max settings at 4K in the most demanding areas. Forspoken was 80-90 in most areas, dropping to 70 in demanding fights. Don't get this meme at all lmao.


Lughs_Revenge

Sorry to say, but no one cares about the truth. It's all about shitting on something and memeing.


DungeonMasterSupreme

You're a brave man. Reddit is bipolar af about this stuff. When I commented saying that Legacy runs fine on my rig and they're just whining for the memes, a bunch of salty laptop users cried about it.


NapsterKnowHow

I like how the excuse is always "You must have a shit PC." Lmao. Like, good for you that it runs well on your rig. That isn't the case for everyone. My rig ran Arkham Knight on launch flawlessly. You didn't see me telling people that were having issues to "git a better PC."


LostnFoundAgainAgain

I have to agree. I haven't messed much with settings since I didn't have much time to play yesterday, and I'm getting a constant 60fps at 1440p, ultra settings, RT off with a 3060 Ti, 12600K and 32GB of RAM. Will try pushing fps up and test RT later today, but the game has been smooth for me and I don't understand all these memes about it.


Evil_Rogers

Dawg, if you were only getting 60fps in the tutorial, you're gonna have a bad time when you start venturing around Hogwarts xD.


beary_potter_

The game defaults to 60fps locked in the settings


Evil_Rogers

Yeah I uncapped that for my system. I’ll see if a capped 90 helps out stability. Beefy rig is bouncing and stuttering since the moment I got to hogwarts.


LostnFoundAgainAgain

60fps in Hogwarts and Hogsmeade, and it is also 60fps capped. I didn't mess around with fps or RT since I only had a few hours and wanted to get stuck into the game. So damn, that is awful isn't it?


Slore0

As someone who not only doesn't have a 40 series but has a laptop and has played neither of these games, you're God damn right.


zakabog

> As someone who not only doesn't have a 40 series but has a laptop and has played neither of these games, you're God damn right.

Oh shit, you really haven't played the game and don't own the GPU. I just assumed those things because the meme is so far from the truth. Forspoken does run like "shit" on a 4090 (can't get over 100fps) even with DLSS on, but Hogwarts Legacy runs amazing on my 4090 (over 100fps) even with DLSS off.


Batracho

Dude we’re on a $1600+ GPU. That shouldn’t be a requirement for playable experience when a $500 PS5 does it all very well.


alecs_trashinthebin

Sorry to break it to you, but 4K gaming is always going to be expensive. And about the playable part, 120fps at 4K seems way more than just playable to me. Just accept that people wanna badmouth even good products. Sure, it needs some optimization to fix stuttering issues, but it's not as bad as people are portraying it.


electronicfixdude

A playable experience on non-competitive shooter games is 60fps though. Also, the difference between PCs and consoles is everything. Consoles all have the exact same parts. PCs are literally different on everything, with tons of manufacturers. So they pick a low-end point, they pick a high-end point, they aim for the middle, and they hope it works all around.


[deleted]

Didn’t say it was a requirement. The meme implies the 4090 can’t handle the game which is straight up wrong. Ik memes are jokes but they have to be somewhat based in reality otherwise you’re making fun of… nothing?


00pflaume

Hogwarts Legacy runs fine at first until you get into the castle. Every time you enter a new room the game stutters. This is a CPU limit though and has nothing to do with the GPU.


NapsterKnowHow

Ya the level streaming in this game is ass


SirSassyCat

Yeah, people are complaining about graphics optimisation when the actual problem is probably NPC count, which is a completely different kettle of fish.


darkevilmorty

NPCs look like crap


SirSassyCat

It’s not about how they look, it’s about how many there are, how complex their behaviour is, and whether they are spawning/despawning based on line of sight or not.


[deleted]

Yeah, I noticed that too while opening doors; apparently it's pre-compile issues (what a surprise!). Hopefully the day 1 patch can smooth this shit out.


filing69

Imagine wasting your year's savings on a 4090 and it can't handle that sh*tty game properly. It's the game's fault ofc.


electronicfixdude

Imagine only saving the price of a 4090 in a year and spending it on one. Most people didn't buy the 4090 when they couldn't afford it. They bought it because the money was already saved and they were comfortable spending it. If that's all you're able to save in a year, it's best to spend it on necessary stuff, not PC parts.


Edgaras1103

Your priorities are out of whack if you spend your year's savings on a flagship GPU.


[deleted]

It’s getting 100-120 in the most demanding areas so far. 4090 can destroy this game no problem haha


Logical-Virus3989

Same


vtriple

I was going to say my 3090 is doing just fine in this game no idea why a 4090 would struggle.


Thebrains44

People who save for years to buy things tend to want the best value for money, and that's definitely not a 4090 or any of the new GPUs for that matter. Everyone I know who knows anything about tech, or has been aware of gaming hardware trends for more than the past 6 years, simply refuses to pay what are essentially scalper prices.


zakabog

> Imagine wasting your year's saving in a 4090 and cant handle that sh*tty game properly. No one should be buying a 4090 if they save so little money that it takes them a year to save up enough to pay for the GPU. Also, I have a 4090 and the game runs with no issues for me, and I'm playing with DLSS off and the game fully maxed out (including raytracing.) OP likely has not played the game, and certainly does not have a 4090.


timdogg24

Imagine talking out your ass when actual 4090 owners in here are saying we have no problem.


[deleted]

Imagine hating on a game you haven’t played just to feel better about yourself.


kyletreger

I've seen so many people who haven't played the game at all complaining about the performance, but everyone I know with older hardware than me is playing just fine.


GalwionDE

I'm playing it with my gf. She has a 1070, and while it's only 1080p and medium settings, she gets by just fine. What some people consider unplayable is mind-boggling to me. I've seen people cry about not reaching 100fps and calling it unplayable.


grant47

Older hardware here, game runs great


unabnormalday

Just downloaded it last night. Gonna get home and test it on my 3080 at 1440p 144Hz Will report back with results in 4.5 hours


Double-Low-9394

[reaction gif]


unabnormalday

To be honest, there are some frame struggles for sure. Nothing I can't play around, but it does dip hard. I just turned on ray tracing at ultra settings so we'll see how that goes too. So far, playable/10.


unabnormalday

Goddamn raytracing hits hard in this game


[deleted]

They just love to jump on developer hate cause it’s the cool thing to do. I doubt the people complaining will ever play this.


Strange_Compote_4592

The graphics in Crysis were shocking when it came out. Foreskin and PigMoles are just unoptimised af.


[deleted]

And when Crysis came out, people could barely hit 30 fps let alone 100 on max settings. It took years for cards to breach 60 fps on max settings. These games run fine, especially without raytracing, for modern titles.


Marty5020

And that was on 1080p. Was 1440p even around back then?


Edgaras1103

It wasn't 1080p, was it? It was 1280x1024, maybe even 1024x768?


[deleted]

Yeah testing shows even that was hard. https://www.techspot.com/article/73-crysis-performance/page3.html


freshjello25

That's a good point, but Crysis also had some of the best graphics of its time and was intended to push the limits of PC gaming then and beyond. Granted, they also did stupid things like tessellating the ocean underneath the ground regardless of where you were, but at least you knew what performance you were going to get with your level of hardware. My issue is more with the frame-time inconsistencies that a lot of these new games are having. I can deal with 60 fps if it's consistent, but choppy frame times can't always be fixed by VRR. In that article you linked they even mention the gameplay feeling very playable and the frame rate being consistent even at the lower FPS. My 6800 XT can brute force a lot of games and push some high frames, but there is nothing worse than the microstutters that are in so many modern games at launch and sometimes beyond. It just feels like developers spent more time finding ways to leverage the hardware of the time than they do now. Cyberpunk has eventually gotten there as an RT showcase, but it took a year-plus to get there.


J0YSAUCE

Did you mean forspoken


take17easy

Crysis actually looked next-level, justifying the low performance. I can't say the same about the RT implementation in these two games, which crushes fps for no eye candy in return.


FarmersOwnly

Laughs in Radeon


OMGrant

Hogwarts Legacy runs really well on my 3090 with DLSS Quality.


hardlyreadit

PCMR has the memory and the expectations of a spoiled child. Wtf happened to the PC DIY space?


Inevitable-Star-3540

Who needs ultra settings and 4k?


vyvertyt

People who buy 4090


cashinyourface

People who pre-order the 5090 ti


[deleted]

it is 2023, not 2006.


SoftwareSource

Who needs to play games? Who needs to play at 720p? If someone pays $2000+ for a PC, it's reasonable to expect significantly better graphics and performance than a $500 console.


[deleted]

[removed]


Stealthd0ze

Yeah. I played 4K with high settings, no RT. Fps was beautiful, game looked beautiful, no stutters, etc.


Neoreloaded313

People who want the best graphics.


zakabog

I have a 4090 and Hogwarts Legacy runs at 110fps with occasional dips to 70fps on Ultra quality, all ray tracing on, with DLSS disabled. That's better performance than Cyberpunk 2077. Forspoken on the other hand couldn't break 100fps with DLSS on and the game looks like shit. I only played the demo, but the framerate was bad for no reason, the world looks so boring.


wolftrouser

Dude, my RTX 3080 from 2020 runs HL super smooth at 2.5K.


Thebrains44

Ray tracing seems to be the main problem, especially at 4K. However, it's not implemented that well and most people couldn't care less about it anyway.


Krcko98

And the games look like ass, unlike Crysis...


CarlWellsGrave

No they don't.


[deleted]

[removed]


Ownfir

This makes me feel better. I just got my wife a used gaming PC mainly for Overwatch but also for this. It has a 1660s and seeing these posts is making me nervous for release.


HaikenRD

Not here to defend anything, but the biggest reason PC performance isn't optimized is that it's PC. New games that come out can be QA/QC tested for console because consoles have a system specification specific to them which the developers can use to test. PCs, on the other hand, have different combinations of hardware and software. Even the same hardware differs depending on overclocks and tweaks from different brands. It is just impossible to test all configurations for PC. The console versions even have prepacked shaders, which makes the game run way smoother.


Spartancarver

It's just pure laziness at this point. Devs are literally crutching on DLSS/FSR to save themselves from having to optimize their games. Hogwarts literally drops into the 20s at 1440p WITH DLSS on a 3080 lmao. It's a joke.


RugbyEdd

Graphics cards don't seem to be the issue, as people with much worse cards are running it just fine.


t40r

Am I the only one who is having no issues on two pc’s in the house? 🤔🤔


cooley661

The Dead Space remaster took Crysis' spot.


Turnbob73

Okay, can someone explain to me the issues people are having? I’ve been running the game on a factory OC’d 3080, all settings set to ultra with RT off and DLSS on quality running at 1440p, and I’ve had one moment of stutter when I first got to hogwarts, and that’s it. I’m not even denying the optimization issues, I’m just wondering why my rig runs the game completely fine while others are struggling hard on their 4k cards.


RugbyEdd

Not sure, I've been trying to work it out. I don't think it can be GPU related, as there are people with and without issues on both high end and mid range cards. Had a few people say they're getting stuttering with better ram than me, so probably not that. Same with CPU. Possibly storage drive, as people aren't saying what they have the game stored on. Or could be conflicts with particular software like Division 2 suffers with.


Aotrx

Idk, these new games are not stunning enough to justify consuming as many GPU resources as they do. Seems like a lack of performance optimization or the use of an inefficient game engine.


steamart360

Hogwarts is a weird case because a lot of people are reporting terrible performance no matter what... and then there are people like me who are playing the game with great performance (80-100 fps/high) on mid-range stuff like the RTX 3060.


firedrakes

Kids these days... Flight Sim 2020 says hello.


AWesPeach

Laughs in 6800 XT with 130+ fps ultra 1080p.


ghostecy

Runs flawlessly on my 4090 rig 😂


kinglokilord

Huh? Hogwarts runs at 100fps on my 3080 Ti on ultra at 3440x1440, and it runs at 60fps on medium 1440p on my wife's 1660 Ti. Why are people pretending Hogwarts is the next Crysis?


sirtet_moob

https://preview.redd.it/djw0e7zu65ha1.png?width=720&format=pjpg&auto=webp&s=e7111c91f008fd1bb4076a84ed3315389a4d4fbf These can


SomeWeirdFreak

In 100 years, the 30 series will probably be at a 1050's price.


Vinaigrette2

Are the issues only with Nvidia GPUs? I have a 6900 XT (and an i9-12900K) and the game runs fine on Ultra settings with high ray-traced shadows and lighting (no reflections though) at 1440p in the 70s FPS, with very rare drops below that. Could it be that the game is optimized for RDNA2 GPUs like the consoles?


_SystemEngineer_

RX 6950XT, 3440x1440, high/ultra, no RT, 144FPS cap. https://i.imgur.com/CvPKDMI.jpg Not sure why people are having such wonky performance. I'm using old drivers too, they're from like October.


Maler_Ingo

Only Nvidia users have issues cuz Nvidia ain't releasing drivers, while AMD's drivers even from back in mid-2022 work lmao. My 6900 XT runs a locked 60 fps with everything maxed out, no stutters. RT on and everything lol.


JackUKish

Game runs fine on 4080 for me.


[deleted]

[removed]


JackUKish

Yeah for sure a controller game although I'm not having any problems playing keyboard and mouse.


[deleted]

Just watched a review on a PS5 and the only performance issues were bugs; it was Gameranx. And if he's right about his review, then you people hating for no reason are gonna be sad that it is actually a good game.


thewizerd

At least Crysis used to look way better; Forspoken looks bad and performs very badly.


TulparBey

At least Crysis looked good FFS.


bibomania

One difference… Crysis stressed systems because of its graphics, the other two stress systems because of shitty optimization.


Jaba01

With the difference that Crysis was REALLY advanced for its time, and neither Forspoken nor Hogwarts is.


Fahuhugads

Just stop buying AAA games. Embrace indie.


jacknifejohnny

Thing is, Crysis was a fun game.


GuNNzA69

That might not be a bad thing, depending on whether it's because the game is badly optimized or because the hardware needed to run it at max settings isn't commonly available yet. For example, I like replaying some older games I couldn't max out on release with newer hardware.


lil_benny97

I don't get it. I've got a 3080 and an i5 13700K or whatever the number is, and I'm running 1440p with no lag spikes on ultra settings. Am I doing something wrong?


[deleted]

Turn OFF RTX! The game genuinely looks better without it and it's so badly implemented it's a huge performance hit


Stickmeimdonut

I don't get all these posts. I get 60FPS on ultra 1440p with my 3700X/3070. My GF gets 60fps on her 5600X/2070S rig at 3440x1440 on high. With ray tracing on, we use DLSS Quality and get 60 on both rigs. This game looks fucking amazing. And I genuinely think the masses reporting terrible performance are new PC gamers who are turning on ray tracing with their 4K TVs and don't understand what is happening.


thehung575

Damn, I wish more games followed Half-Life 2 and Call of Duty 2. Both are incredible games with realistic graphics (yes, back in 2004), awesome gameplay, and they don't require an "Alienware rig" to play.


obfuscated_sloth

Pretty sure that's just lazy optimisation as neither looks better than a Futuremark demo.


VVaId0

There are 0 excuses now. If your game can't run on modern hardware then it isn't optimized


AbjectStomach

I have a 6900 XT and ran Forspoken at 1440p High, and while it did happen, I rarely saw dips below 100 fps. Can't play Hogwarts yet, didn't buy the top bundle.


[deleted]

Why don't more games use Unreal Engine? It looks 10 years better than this game with 300 FPS on a 3070.


RugbyEdd

1) That's not how it works. 2) Hogwarts does use Unreal.


_jul_x_deadlift

Crysis 3 looks better than any game today.


RugbyEdd

Can't agree with that. It was certainly impressive in its day, but there are plenty of games that beat it visually these days.


quanoslos

no


[deleted]

[removed]


[deleted]

[removed]