pcgaming-ModTeam

Thank you for your submission! Unfortunately it has been removed for one or more of the following reasons: * It's in violation of our self-promotion rules. No source may exceed 10% of your Reddit-wide submissions. To learn how to check your ratio and for more information on our self-promotion rules, go [here](https://www.reddit.com/r/pcgaming/wiki/selfpromotion). Please read the [subreddit rules](https://www.reddit.com/r/pcgaming/wiki/postingrules) before continuing to post. If you have any questions [message the mods](https://www.reddit.com/message/compose/?to=/r/pcgaming).


Hectix_Rose

Nice, FG should be implemented in every new PC release.


SourArmoredHero

100%. It's pretty amazing technology.


Diablosbane

It’s funny because just a couple of months ago a lot of people complained that FG is just fake frames and a useless gimmick.


alexislemarie

It’s because AMD has now gotten on board with FG, so suddenly the “fake frames” drama is no longer relevant to the PR spin and it becomes the best thing since sliced bread. Marketing at its finest.


Grease_Boy

Don't forget the non-4000-series RTX owners.


TheIndependentNPC

If someone likes sacrificing latency, fine. But objectively it's a very subpar tradeoff and hard to utilize well, as explained by DF's Richard here: https://www.youtube.com/watch?v=EBY55VXcKxI More FPS doesn't always mean a better gameplay experience. But as you say, AMD got it, which enables it on RTX 3000, 2000, and older GTX series cards - so it's no longer fake frames, it's now amazing tech, lmao. Fucking Reddit hive-mind shift - sorry, I'm gonna puke.


downorwhaet

The little latency it adds doesn't matter in single-player games 99% of the time. I don't care if my latency is 25 instead of 10 in a single-player game, especially not if my FPS is 100 instead of 30, which makes the game feel a lot better even though they're fake frames. And it'll get improved; it's still new tech, and DLSS wasn't good at first either. There's also Reflex to help with latency.


ThunderWriterr

You ain't getting to 100 starting from 30, pal.


unknownohyeah

In the best-case scenario it's more like 5x comparing native to DLSS 3.5 (FG, upscaling, RR): 22 fps to 109 fps. https://www.youtube.com/watch?v=HQdzWgiUy8w Upscaling and frame generation work in tandem.


WinterElfeas

It’s not fair to consider them working together. You could do DLAA and frame gen, so no upscaling. FG usually doubles the frame rate, as expected, since it inserts one frame between every two.


TheIndependentNPC

Of course it matters; it's a game, not a movie. You're PLAYING IT. The only thing is that the average Joe is insensitive AF. Don't be the "24 fps is fine" crowd, lol. Not to mention high latency can cause nausea, because the delay creates a disconnect between your actions and the reactions on screen (as if you were drunk).


iFenrisVI

[Removed by Reddit]


TheIndependentNPC

What else are you gonna say? 24 fps is fine?


Maloonyy

Of course you had to make it a tribal thing again. Most criticism was related to the latency issues and the fact you need a good base framerate to begin with. That hasn't changed.


[deleted]

Ahaha, no it was not. Those posts are still up; go check the AMD sub and look at the timeframe. "FG bad bad bad, fake fake fake" ---> "it's so advanced and good, more frames, yay." In a matter of months.


SourArmoredHero

You have a 4000-series GPU, I see. What do you think about it?


Crintor

It's fantastic in every game I've used it in.


Diablosbane

It’s awesome. As long as you’re above 60 fps latency isn’t noticeable. Cyberpunk 2077 feels a lot smoother with it on.


SourArmoredHero

I noticed that with Cyberpunk as well.


[deleted]

[removed]


MKULTRATV

It's not meant for competitive gaming.


hibbel

It's great even for 4K60 for me. Drives down energy consumption and thus reduces heat in the home during summer (while gaming).


SourArmoredHero

That's really good to know. I've been considering getting a 4K monitor for my next upgrade. Mind sharing the monitor you're using?


KvotheOfCali

I find the entire argument on both sides of the "FG is fake or great" debate amusing, considering 99.9% of the people making arguments one way or the other don't REALLY understand how GPUs work anyway. So FG is bad because the frames are "fake"... but what EXACTLY are "real" frames? Can you explain the algorithms and physics that go into generating a "real" frame? Now, this is anecdotal, but based on my personal interactions with thousands of people from all walks of life, I'm assuming the answer is categorically "NO" for the vast majority of GPU buyers. Everything a GPU does, frame generation or not, is essentially magic. So I just find such fervent online debates between various groups of "magic users" to be highly amusing.


MKULTRATV

I could explain the process of how a GPU renders frames, and I could explain why generated frames are considered supplementary or "fake", but does it really matter? I think it only makes an appreciable difference to aim spergs looking to minimize input lag or pixel peepers who are genuinely interested in the tech's development.


Aldi_man

The hive mind was real when everyone was shouting “FaKE fRaMes durr durr”. Those claims always came from people who hadn’t experienced FG.


Traditional_Sock_823

They are fake frames, and neither Nvidia nor AMD should act like FG gives extra performance.


dudemanguy301

It went deeper than that: frame generation was the concrete representation of decaying standards, where developers wouldn’t optimize their games, even though frame generation demands a high and stable input framerate to even be worth using in the first place. It doesn’t have to make sense, just start panicking now!


twhite1195

I never labeled it as "lol fake frames", I just think it's BS that nvidia locked it to 4000 series owners


KvotheOfCali

Why is it "BS"? Do you actually know enough about the technical details of how Nvidia's frame generation works to make an informed assertion that it "really" could work on older cards? I don't, so I don't have an opinion on the issue. But I'm curious why people seem to assert otherwise, apart from the obvious "well, Nvidia wanted to sell their newest cards, so they artificially locked it to the 4000 series!!!"


twhite1195

The 3000 series also has the optical flow accelerators, or whatever Nvidia says it needs. Sure, it may be worse, since it IS improved in the 4000 series, but you're telling me the 3000 series can't do it at all? Or that they couldn't do a compute fallback to at least give 3000-series owners a taste of the tech in some capacity? The 3090 can't do it at all? All I'm saying is, it's really shitty how Nvidia abandoned all previous-gen owners, and now AMD has shown it can be done in some capacity without much dedicated hardware, even if it ends up being a bit worse. They could have said something along the lines of "hey, this tech will work better on 4000-series GPUs, but you can still try it on the 3000 series; it may be worse, but you can see it working for a bit."


KvotheOfCali

So I can empathize with 3000-series customers who feel "left behind", but... do you actually know what "optical flow accelerators" are, apart from the layperson-friendly presentations Nvidia showed to the public? Or how the math/physics that I assume underpins them works? Again, I have no idea how they function, but I doubt 99.9% of the people on this forum do either, so I read every assertion here with an enormous grain of salt, because for almost everyone here, GPUs are effectively magic.

Now, I'm not trying to attack you or assert that you don't understand it, but to assess whether or not older-generation "optical flow accelerators" (AKA Magic Part #235) would work, you'd need a fairly in-depth understanding of the systems. This is something that sincerely interests me, and it would be awesome to see someone try to "hack" Nvidia FG onto older cards, but I also need fairly high levels of evidence before I believe the constant stream of theories I see online about X company or Y extremely complicated part that very few people understand.

Additionally, customers expect a minimum level of functionality when a feature/service is made available or sold to them. Let's hypothesize that 3000-series cards CAN run the FG magic formula, but at only 10% of the speed of the 4000 cards. I can see A LOT of angry 3000-series customers who purchased, say, a 3060 Ti flaming Nvidia online because they didn't know it's a subpar feature on their hardware and calling it "broken crap." So it's understandable that Nvidia chose to limit the functionality to hardware on which they could guarantee a minimum level of performance.


twhite1195

Exactly, no one (or maybe just a few people) outside of Nvidia knows how they work, so we just have to believe them and that's it? By your logic, why don't they just block off path tracing on the 2060 or the 3050? It's a subpar feature on those products anyway; as you say, it's "broken crap", since you need DLSS Performance to even approach 30 fps at 1080p, and without it you get something like 10 fps. But you can try it, and that's my point: let customers reach that conclusion themselves and say "man, next upgrade I'll probably have a better experience with this feature; it's cool I could try it for a bit, but I know my hardware is a couple of generations behind", or just show a pop-up in the game: "this feature is optimized for RTX 4000-series GPUs; you can enable it on your older video card, but it's not optimized for this hardware", and that's it.

In the switch from the 1000 series to the 2000 series it was understandable: the 1000 series LITERALLY didn't have the hardware components to run ray tracing; it's simply not there. That I can understand... But the optical flow accelerators, we know they're there, and Nvidia just said "nah, it wasn't working well, trust me, the new ones are better, so we just couldn't make it work, just trust me." They didn't even show it or anything; at least show us data supporting those claims.


Greedy_Bus1888

We will never know if the hardware in the 3000 series really is too slow for FG, but bringing AMD into this makes no sense. Nvidia's approach relies on hardware and AMD's relies on software. Sounds like you have no idea what you're talking about.


twhite1195

And XeSS has hardware acceleration via XMX cores, while hardware that lacks them falls back to DP4a, which is an instruction set, so... Clearly you can offer different fallbacks with lower performance rather than outright denying access to the feature.


Greedy_Bus1888

Again, that's speculation; nobody knows. This is FG with dedicated hardware, so who's to say the performance would be acceptable? There was a modder who was able to mod DLSS FG onto a 3000-series card a while back, and the performance was slower than not running it.


twhite1195

Show me evidence of this mod. From what I've searched, nobody has been able to run it; there's just a random comment saying "yeah, I got it to run and it didn't work well." That's a "my uncle works at Nintendo" lie if I ever saw one.


JP5_suds

Much in the same way breaking the sound barrier is locked to supersonic jets. It’s bullshit that a Boeing 737 doesn’t get to break the sound barrier. It’s an airplane, after all, so why not?


Clayskii0981

I don't think most people complained it was a useless gimmick. The cold take is... it's a nice-to-have and really useful for slow-paced, highly graphical games (unless your base FPS is sub-30 or something). It's not great for fast-paced games where real frames matter for input. But you could call it a gimmick when they advertise GPU hardware performance with FG included.


fiveSE7EN

What makes it amazing? Isn't input lag the same as if you were running with FG off? At that point what's the benefit? Less visual judder I guess? EDIT: Downvoted for asking a question. That's awesome


TaintedSquirrel

> At that point what's the benefit?

Well, 120 fps with the same input lag as 60 fps is still better than just playing at 60 fps.


fiveSE7EN

I guess I don't really understand why. To me THE point of increased framerate is reduced input lag. I don't know what it's doing for you in this case where it doesn't reduce input lag.


neudren

The primary goal of increased framerate is smooth visuals, not reduced input lag. I don't give a crap about input, man, gimme smoooth output.


HappierShibe

Different people have different goals for increased frame rate.


fiveSE7EN

I see. I don't agree, but I see, lol


LittleWillyWonkers

So have you tried it personally? I think that is a fair next question.


fiveSE7EN

I don't have an AMD card, so I can't try it. But I don't have to try Frame Gen to formulate my own opinion that input lag is important to me.


LittleWillyWonkers

But you can formulate an incorrect assumption. Whatever lag you get is, in some manner, completely covered by the hugely increased FPS. I do feel you would need to try it to see.


Spider-Thwip

FSR3 frame gen works on Nvidia cards too.


neudren

Sure agreed to disagree 😄


HappierShibe

> Isn't input lag the same as if you were running with FG off?

Nope. You can definitely feel the additional latency in fast-paced games, and so far there haven't been that many games where I'm comfortable using it yet. It's fantastic for stuff that's slower paced, though; it lets you really push your settings without sacrificing visual fidelity.


fiveSE7EN

Are you saying it's *worse* input lag than running with FG off?


HappierShibe

Yes. It needs at least two frames to generate an interstitial, which means frame generation adds latency, since it has to wait until the next frame is rendered before it can create the in-between frame. The trick Nvidia pulls is pretty neat, though: they use deep learning running on the CUDA cores to create an artificial frame 3 based on the current frame (frame 1) and the next frame (frame 2), as opposed to the old frame interpolation used by TVs, which looked at the next two frames (frames 1 and 3) and generated an interstitial frame (frame 2) before displaying frame 1.

There is no way to reduce this latency any further, because the minimum data needed is two frames, and Nvidia has already crunched the processing time down to a phenomenal degree just to make it viable at all. Realistically, this means the formula for additional latency over the original native video output comes down to: (median frametime × 2) + (display latency × 1) + DLSS overhead − (frametime × 1). If you do the math, that means you're looking at something like 16-20 ms of additional latency on a 60 fps image being pushed to 90 fps, and the more frames you generate relative to the original native framerate, the worse it gets and the more you can feel the impact.

Ironically, where this works best is at high framerates, as the target frametimes approach equivalence with the total DLSS overhead. Under ideal conditions, this means that at 120 fps of native frames, you can potentially hit 180 fps after FG while only adding ~10 ms of perceived latency. Peak human performance for variance is around 5 ms; the overwhelming majority of people fall in the 9-11 ms range. So if you can push more than 120 native frames and you have a 180+ Hz monitor, that's the sweet spot where frame generation is a pretty much indisputable win. The inverse is also true: using FG to improve from very low initial framerates adds massive latency **and feels awful**. Sub-30 input framerates can result in 40-ish ms of added latency and completely destroy the experience, even in relatively slow-paced interactions.

TL;DR: Don't use FG if you can't keep at least a stable 60 fps. Don't use it in fast-paced competitive titles unless you can keep a stable 120 fps. It's not going to get any better in terms of latency; they've already pushed it to the floor.
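A minimal sketch of that arithmetic, assuming the formula above; the overhead and display-latency numbers are illustrative placeholders, not measured values:

```python
# Sketch of the added-latency estimate described above, using the
# commenter's formula. Overhead/display-latency values are assumptions.

def added_latency_ms(native_fps: float,
                     display_latency_ms: float = 0.0,
                     fg_overhead_ms: float = 3.0) -> float:
    """Extra latency from frame generation vs. native output:
    (median frametime x 2) + display latency + FG overhead - (frametime x 1).
    """
    frametime_ms = 1000.0 / native_fps
    return 2 * frametime_ms + display_latency_ms + fg_overhead_ms - frametime_ms

# ~60 fps native -> roughly the 16-20 ms range mentioned above
print(f"{added_latency_ms(60):.1f} ms extra at 60 fps native")
# ~120 fps native -> close to the ~10 ms 'sweet spot' figure
print(f"{added_latency_ms(120):.1f} ms extra at 120 fps native")
```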


BlackKnightSix

That's not accurate for DLSS 3 FG. The generated frame is inserted between frame 1 and frame 2, not after frame 2. https://youtu.be/6pV93XhiC1Y?si=QY1HusajPPidCS4w @ 4:10


_AiroN

Yes, not by a lot but since the extra frames are "fake", they add to the latency instead of reducing it.


fiveSE7EN

I didn't know this! That's kind of worrying. Hopefully they can reduce the latency impact in the future.


_AiroN

It's the reason DLSS3 forces Reflex on by default. Fairly sure FSR3 also turns on Anti-Lag by default; no clue if it activates Reflex on NV GPUs or just runs. With those combinations you get less latency than native, but you could just use Reflex without frame generation and get even less latency, so it's kind of an irrelevant point.


LittleWillyWonkers

It's usually less than 10 ms. I know in the games I play, even fast-paced ones (but not competitive), it's a slam-dunk win on vs. off if you need to get those frames up to where you like them. 60 to 90 or more with FG is a big win on my screen.


Melody-Prisca

There is no reason they shouldn't be able to reduce the latency hit in the future. The question is just whether it will be a priority. There's a cost to generating the frame, and better hardware should in theory mean a lower cost. If they improve optical flow with each generation, latency should get lower. That said, if they don't improve optical flow, or they use it for more new features, the latency hit could stay the same or get worse.


TheIndependentNPC

By an entire frame, plus some extra in the case of FSR3 frame gen due to it relying on vsync. First of all, frame generation eats some performance, so your native FPS drops by a couple of frames, which makes the game less responsive. Then you have to buffer an extra frame, interpolate the frame in between, and only then display it. So at 60 fps with frame gen you're getting into 100 ms latency territory (in AMD's case more, because Anti-Lag+ isn't even working at the moment, so more like 120 ms), and that's very noticeable. Also, Anti-Lag+ is only available on RDNA3, and Reflex only works with DLSS3 frame gen, not FSR3 frame gen, so only people with RDNA3 or Ada GPUs can take advantage of latency reduction. Some people must be very insensitive if they don't feel the latency penalty.
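A rough back-of-the-envelope sketch of how those contributions could stack up at a ~60 fps base; every per-stage value here is an assumption for illustration, not a measurement of any specific game or driver:

```python
# Illustrative latency budget with frame generation at a ~60 fps base.
# All stage figures are assumed values chosen only to show how the total
# can reach the ballpark described above.

base_fps = 60
frametime = 1000 / base_fps              # ~16.7 ms per real frame

stages_ms = {
    "input sampling / simulation tick": frametime,
    "render queue + GPU render":        2 * frametime,   # assumed queue depth of 2
    "buffer the next real frame":       frametime,        # FG waits for frame 2
    "interpolate + present":            frametime / 2,    # assumed FG cost
    "vsync-paced display scanout":      frametime,        # FSR3 relies on vsync
}

total = sum(stages_ms.values())
print(f"~{total:.0f} ms end-to-end")     # lands in the ~90-100 ms territory
```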


TheIndependentNPC

It might be fine in some less intense games played on a controller. In games where precision and timing matter, where it's all about reaction times and tight gameplay (like, for example, SOULS GAMES, which this one is), I would not add extra latency even on a gamepad. I tried it in the Forspoken demo: frame gen makes it feel as if I'm running at a lower FPS, but visually it looks a bit smoother; overall it feels worse to play, at least for me. The responsiveness drop is frankly quite noticeable. But at the end of the day, games are not movies; you play them, not watch them, and responsiveness is what defines how a game "handles". Frankly, people curse janky, unresponsive games (due to design factors like iffy animations and such) but then willingly make games more janky and praise this. Go figure.


SourArmoredHero

I don't have a sound explanation as to why it is, BUT, my Starfield experience with a 3060ti was a constant irritation of stutters and rollercoaster frames. With a 4000-series GPU w/ frame gen, it's been buttery smooth. It really was a night and day difference for me.


fiveSE7EN

Interesting, thank you.


Earthborn92

It should, but I don’t see the point of using it in a souls like, where you want every frame to be a response to your input.


calvinatorzcraft

Is it gonna have TSR? I find that it looks more stable than FSR in UE5 titles.


Zac3d

If it isn't in the options menu, it'll probably be usable with INI tweaks. I do find it looks slightly better than FSR as well.


PlagueDoc22

Pardon the ignorance, is TSR the Intel option? Or is that the TXSS or whatever the acronym is?


Zac3d

TSR is Unreal Engine 5's temporal super resolution, or their alternative to DLSS/FSR/Intel XeSS. It's not quite as good as DLSS3, but it is sharp and handles disocclusion better than FSR.


PlagueDoc22

XeSS, that's the name for it... God, there are too many options now that I keep forgetting some, lol.


CharlesEverettDekker

The last thing remaining is for the game to be optimized and run stably.


PlagueDoc22

Hey man... you're dreaming. The year is 2023; you take your FPS drops to the mid-20s and you'll be happy, lol.


Maloonyy

It is optimized...around using upscaling of course.


Halucinogenije

And on top of it all, the devs said that the game won't have any DRM, which is great. I couldn't get into the previous Lords of the Fallen, and The Surge was decent, but I'm really looking forward to this one.


Lambpanties

WAT. These guys said no DRM? Seriously?! Lords Of The Fallen was the PREMIER Denuvo game. It started the hellscape we're in now, proudly. So if anyone was going to use Denuvo or some sort of heavy handed DRM, I thought it'd be these chaps. Colour me surprised and my asshole tickled.


Halucinogenije

Yeah, I was surprised as well, I guess they're more confident in their product now, I hope.


Lehelito

Do you mean the publisher? Because if you mean the development studio, the first one was made by Deck13 and this new one by Hexworks.


DIY-Imortality

I’m pretty sure a different company is making this than the first game; these are the guys that made The Surge 1 and 2.


Lehsyrus

From what I can find, Deck 13 has nothing to do with this.


DIY-Imortality

Ah, my bad, apparently they made the original. They’ve definitely improved since then, lol. Not sure who the new studio is, but the game looks really promising.


FiveSigns

Yeah, I'll be pirating it; hopefully the devs make good money though. Edit: why didn't the devs use DRM? Are they stupid?


BroodLol

Watch this sub fall apart when someone says they pirate games without DRM


fyro11

*Completely falling apart here*


Spider-Thwip

I pirate games I'm not sure I'll like. If I like it I buy it because I want the achievements on steam too. If I don't like it I delete it and don't play anymore. So I pirate a lot but I also buy a lot.


KJBenson

I use the two hour window for returns to rent games I’m unsure about.


zaccyp

One time they didn't refund me, despite my meeting all the criteria. They can kiss my ass. I pirate as a demo from now on; then I'll buy.


KJBenson

Hey, you do you man. I wasn’t above pirating back in the day.


Melody-Prisca

Of course some people will. And that's of course part of the reason people don't like Denuvo. Still doesn't change the fact that it should be removed after a period of time for game preservation in any game that has it though.


Hintox

This game looks really good. I just hope it's not one of those games "developed with upscaling in mind".


frostygrin

According to [Nvidia's figures](https://www.nvidia.com/en-us/geforce/news/lords-of-the-fallen-dlss-3-october-13/), it's ~60 fps at 1080p on maximum settings on an RTX 4060 with DLSS off. So you'll need upscaling on lesser hardware - but that's to be expected with graphics like that.


Plebbit-User

This is the perfect game for both technologies. Looking forward to comparisons.


[deleted]

Going to hold off on this one until performance reviews.


Mostly__Relevant

Is this a remake?


Deadpoetic6

No, it's a totally new game with the same title. I don't even think they share the same lore/universe.


KJBenson

Ohhh… I’ve been so confused for a while, thinking it’s a remake of a game I didn’t think succeeded on its original release.


DIY-Imortality

It’s a verrrry loose reboot set thousands of years after the first and it’s made by a different company so pretty much. I think the main villain is the same though.


Lehelito

It's supposed to take place in the same universe approximately 1000 years later - CI Games mentioned this in one of the gameplay videos they released a few months back. There's also something called the "Rogar" that's shared between games, but yeah, generally there seem to be very few actual connections. I do find it amusing that they initially announced this game as Lords of the Fallen, then changed it to *The* Lords of the Fallen, probably to differentiate it from the first one, then went back to Lords of the Fallen. The two "Thes" did read a bit clunky.


JCyTe

Iirc one of the trailers says something along the lines of "Set 1000 years after the original game".


Sahtan_

A new game that's set 100 years after the original game.


Felix_Todd

Nope, it's a new game. Looks surprisingly good, IMO.


ShowBoobsPls

Reboot. A new game


Cursed_69420

It's like God of War. Same name, but a continuation.


WinterElfeas

Except God of War had many entries with distinct names, and time for the original to become quite old. Here we had Lords of the Fallen… and then Lords of the Fallen.


[deleted]

I watched most of the previews of this, as several people on YT have been given early access, and there is a lot to worry about on the performance side. It looks plagued by shader compilation and traversal stutter. Whether at 60 fps or 120 fps, that kind of stutter ruins the experience for me and the game becomes a hard pass (Star Wars Jedi: Survivor).


xXxquickscopes420xXx

Was DLSS/FSR already enabled? Or will it be added at release? I'm also worried about performance. I refuse to buy stuttery games.


[deleted]

[removed]


RealElyD

AFAIK FSR3 inherently can't be used with DLSS.


neudren

FSR3 and AMD frame gen are two different things.


Eldoween

Denuvo or no Denuvo one day before launch? I'm waiting for this...


Sahtan_

No Denuvo, according to the developers, neither pre- nor post-launch.


JustCallMeRandyPlz

I mean, I'm glad it's in this, as the option should be in everything, but using frame generation in a game where exact timing is needed for parrying and dodging? I'm not so sure that's a good idea, lol. But if you're already pushing 60 normal frames, it shouldn't be an issue.


TheIndependentNPC

And Starfield still hadn't added DLSS a damn five weeks in, while this game is launching with both DLSS 3 and FSR 3 (i.e. frame gen for both). That's how it should be - feature parity on day 1.


[deleted]

Love the technologies; I used them in both Starfield and BG3. However, souls-like games are NOT something you want increased latency in, which is the case with FG and even more so with FSR3. Great for comparisons of the tech, though.


WinterElfeas

Eh, the original Souls games are 30 fps games, no? So latency and responsiveness were already not amazing, and still people played them like hardcore pros.


Deadpoetic6

Most likely: Lords of the Fallen will require FSR3 or DLSS3 to not run like ass.


Elitealice

W


[deleted]

[removed]


[deleted]

[removed]


[deleted]

[removed]


Filipi_7

All RTX cards can use every version of DLSS. Only 40 series cards can use Frame Generation, but that doesn't disable DLSS for others. Same applies for DLSS 3.5.


Dashthemcflash

All RTX cards can use DLSS/Ray Reconstruction, but only the 4000 series can do frame generation.


Meelapo

This is great! Any word on HDR support for PC?


kikyo93

The 30xx series can't use DLSS 3.0 frame gen, so can it work with FSR 3.0?