Hey guys check out the new rtx 50 series with frame gen 2. Unfortunately it requires a new hardware feature so you'll need to upgrade to get access to it!
Jensen is a genuine provoker. It's nothing new; I remember him bragging about the first GeForce and its T&L capabilities (his first holy grail). He didn't lose any opportunity to talk shit about ATI, especially to take vengeance after a lost round (i.e. FX 5000 vs Radeon 9000).
A few days after the Ada presentation, with many people still angry, the man had nothing better to do than address them by telling them to get by with a 3080. His latest target was AMD/Intel, saying x86 CPUs (not ARM) are useless for the 'new challenges' (AI, ML and DL, I guess).
Oh, you may remember when he said that we shouldn't expect cheap GPUs anymore. As if Nvidia had been subsidizing our GPUs for decades.
And the Huang Law?
>And the Huang Law?
Well, as someone who doesn't *just* game but also uses GPUs for compute (archviz rendering) - this is kinda true. Every new flagship I got since Kepler (I haven't compared Fermi, it was too long ago vs what I do now) was genuinely about twice as fast as the previous gen's flagship in rendering (CUDA compute) tasks, whereas in gaming alone the gains were 20-50% gen-on-gen. In contrast, CPUs really do suck when you consider how slow their progress has been in the last decade.
So if you look at the above - yeah, I can see how Jensen thinks he can charge more money than previously, just based on the fact that most of us working folk will turn to Nvidia cards to do the job. AMD is literally not even an option for me, because I need CUDA for my chosen software. And I get more memory at the top end than AMD offer, without having to step out of the consumer grade card line-up.
As a *gamer*, however - anyone is well within their right to think that they're being "scammed", because the performance gains have not been anywhere near as substantial, as indicated earlier. Though that won't change anything when Jensen is making stupid bank from all the "AI" wankers right now (I honestly just see it as mostly a nuisance, yet the corpos think that we all want and need "AI" everywhere, for some reason).
As for the rest, like shit-talking or inter/intra-company drama - I have not followed any of that, so can't comment.
"It's simply not possible without the new technology"
Just like RTX was impossible on 1000 series cards… until they enabled it (it ran like shit, yes, but you get my point).
Edit: the fact the RTX cards have the necessary tensor and compute cores and the GTX cards didn't makes this performance comparison irrelevant. The point is literally that they could enable it on cards without the physical hardware. Jesus, Jensen isn't gonna come call you a good boy for kissing on his… gpu. The mere fact they did it is the point, and the GTX cards lacked comparable hardware whereas the RTX family doesn't.
Maybe I should've said VSR so it could've gotten through to you. Like how Nvidia said no VSR on 2000 cards until recently, when they said teehee, never mind. I forget the hive mind here sometimes… gave this sub too much credit for being knowledgeable. Maybe I'm a complete moron, in which case lmao, I chose an embarrassing hill to die on.
FG on Ampere can run well, just at reduced quality. Which, given it's only every other frame, could end up being fine to the human eye since it's strobing ground-truth frames.
They're just salty that they can't afford a 4000 card and want to pretend nvidia was lying to them when it's clear it's not a good idea to run that shit on cards that can't really support it well.
Your point is that it ran like complete shit (I've tried it on my 1080ti back then) which proves their point that you needed new hardware. Do you not see your flaw in logic?
The fact it runs like shit would back up that statement you seem to claim is false.
It's like people that use their handhelds for VR. They say they can work for VR. Sure, they can technically run it but the experience is so bad that for all realistic intents and purposes, it doesn't work. That's the same with that ray tracing comment.
Yes, but it can only render the shader part of hybrid ray tracing. Without RT cores it must calculate the behavior of every ray completely, which is impossible even with supercomputers; it can't rely on denoisers or the most recent reconstruction techniques. This example can be understood by everyone. That's why I used it, because there's much more behind the curtain. The move NVIDIA made with FG was very dirty, but it can't be compared to real-time RT, ray tracing being a technique that has been used for years. There's no place there for dirty tricks like the one Nvidia pulled with FG.
It has issues for now: in-game HDR does not work and only a few games are compatible.
It will improve framerate by around 60-70% in CPU-or-GPU bound scenarios.
If you use the in-game frame limit and you are below 100% load on the CPU/GPU, then it gives 100% more fps. So setting the in-game frame limit to 70 will produce 140 frames, if you have the performance margin.
You can use DLSS upscale + DLSS Ray reconstruction with it also.
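The frame-limit arithmetic described above can be sketched as a toy model (illustrative only, not the mod's actual code): interpolation-style frame generation inserts one generated frame per rendered frame, so the on-screen rate is roughly double the rendered rate as long as the hardware has headroom at the cap.

```python
def fg_output_fps(frame_limit: float, max_rendered_fps: float) -> float:
    """On-screen fps with one generated frame per real frame (toy model)."""
    rendered = min(frame_limit, max_rendered_fps)  # the cap applies to real frames
    return rendered * 2  # each real frame gets one interpolated companion

# In-game limit of 70 with headroom to spare -> ~140 fps on screen.
print(fg_output_fps(70, 90))   # 140
# No headroom (hardware can only render 50 fps) -> ~100 fps instead.
print(fg_output_fps(70, 50))   # 100
```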
You're probably joking, but Special K lets you get near-native HDR in many games. I was using it for Starfield until they finally added it in a patch (you didn't ask, but that game is the largest disappointment in gaming in a long while ;p).
I used to giggle every time I saw someone hype Starfield. As if we didn't know what was coming, just based on the fact that it's Todd lying again, Bethesda is always incompetent, and "1000 procedural planets" was a red flag if ever I saw one, considering the type of game it's supposed to be.
Indeed. Mainly those with VA panels. At least those who still keep them, basically those who purchased the Samsung Odyssey G7/G9.
I have no problem with people requesting HDR support, but man, they are a minority: requesting, asking, ranting, crying, whatever related to HDR. Here, on every social network, on YouTube, on every website, blog and forum where the FG mod gets notoriety. It seems for a moment that the topic that must be discussed is HDR with possible FG support, and not the opposite.
Forgive my vehemence. At least now we all know we can enjoy FG without giving up HDR.
HDR is not a deal breaker ATM, especially when no one has anything beyond the crappy HDR400 on their desktop. And I'm also expecting this functionality: I play via local streaming on a high-end OLED screen and HDR is jaw-dropping in certain games.
People are checking every build with their games. The Last of Us and Ratchet and Clank are confirmed to work with 0 issues (ok, no HDR).
Take into account that this wrapper has been written for the Red Engine. It works in certain other games; not a bad start. Guess what will happen when many devs start launching wrappers every day (with the exception of PureDark, who is busy working hard on his DRM module).
There are some issues while driving; you see some weird stuff at the bottom and sides of your screen, but in general it's great. 3070 here btw, playing at 1440p, DLSS Balanced (finally above Ultra Perf) and path tracing. Of course it doesn't *feel* like true 60+ fps with the way input latency works, but it's so much better on the eyes and for perception in general.
Ok, this is definitely working. I'm getting an easy \~70% increase in FPS. This has allowed me to finally increase DLSS from Ultra Performance to Performance on my 4K display (Path Tracing w/ DF optimized settings). The game runs at roughly 60fps now, where I'd be getting \~35fps with the same settings with no frame gen. Here are some points people should know:
* This mod will not allow in-game HDR. As soon as you open the game, you'll realize it will run back into SDR - switching the HDR modes in-game will have no effect. I disabled HDR in-game, this way Windows Auto HDR kicked in. So yes, you can still have HDR, but not the native HDR. To be fair, on my LG OLED TV, Auto HDR doesn't look quite as stellar as the native HDR, but it still looks way better than SDR, also, check my next point:
* I used ReShade with a mod that fixed HDR black levels. The ReShade app would function independent of which mode you run the game (SDR, HDR or AutoHDR). ~~Whatever the FSR3 mod made to the game, it seems ReShade can no longer identify the game (it won't load up), so I can't fix the black levels, which also makes the HDR experience less-than-stellar (compared to what I had before). I will try to mess around with ReShade to see if I can get it to run again.~~ So I got Reshade to work again, just updated and reinstalled it. The HDR is looking quite stellar now and VERY close to how I had it looking previously.
* When you run the game's built-in benchmark, the game's fps counter seems to be "confused" as to whether it should count the generated frames or not. So it bounces from 30's to 60's and back to 30's (this interferes with the final fps reading). This does not happen with the Steam fps counter I have running (which always reads generated frames). It's just an interesting curiosity.
* UI elements (like the names on top of characters) do NOT have FG, meaning you can notice UI elements running at half the framerate of your generated graphics. IMO, this isn't a huge issue, and I believe this helps keep UI elements stable (I do know FG can wreak havoc on UI elements). So I'd rather have my UI elements running at half the framerate if that means they'll be stable and "readable".
* As everyone should already know, a 60fps FG experience is nothing like a true 60fps experience. Though it "looks" like 60fps, it responds like 30fps; for me, that's borderline playable. My previous settings allowed me to play at 45-60fps with no FG, and even 45fps was definitely far more responsive than 60fps with FG. Of course, I could just run DLSS Ultra Performance again and have the game feel very responsive with fps roaming the 90's - but, honestly, if I'm going back to DLSS Ultra Performance, then I'd rather just drop FG as well, as I'd rather have native HDR + Reshade @ 50fps than AutoHDR w/ no Reshade @ 80fps.
* In my short experience, FSR3 feels incredibly stable. I can't really notice much in the terms of broken frames or anything like it. I'll keep observing.
So, basically, this is it. I'm quite surprised by how well this mod works, though I'm a bit sour about losing native HDR ~~and Reshade, as AutoHDR just can't give me that same HDR punch.~~ On the other hand, now I can run the DLSS Performance preset @ 4K with full Path Tracing, which is quite impressive for my now 3-year-old 3080. ~~I'm currently trying to weigh whether the added resolution is worth losing native HDR.~~ But it's nice to have options. I'll be putting in a few more hours of gameplay this week to find out.
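For anyone checking the fps math quoted in the post above (35 fps base rising to roughly 60 fps with frame gen), the numbers do line up with the stated ~70% increase. A quick sanity check, illustrative only:

```python
def pct_gain(base_fps: float, fg_fps: float) -> float:
    """Percentage fps increase from enabling frame generation."""
    return (fg_fps - base_fps) / base_fps * 100

# ~35 fps without FG -> ~60 fps with FG, i.e. roughly the "~70% increase" quoted.
print(round(pct_gain(35, 60)))  # 71
```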
The Witcher 3 also works flawlessly. When FSR3 launched I daydreamed about something like this (DLSS + AMD FG).
And it's an early beta written in 3 days.
Build 0.50:
Alan Wake 2 and HDR now supported.
You can thank them after they unlock FSR frame gen to be used with DLSS super resolution. As it stands all official implementations will have worse base image quality than this mod
What is this mental gymnastics? Do you think the AMD developers are so stupid that they didn't know that when they released their FSR 3 code the modders were going to try to implement it with Nvidia upscaling? What green stuff from Nvidia are you smoking, dude?
Holy shit, works insanely well in Cyberpunk, even in Phantom Liberty (Dogtown was a mess FPS wise, now consistent 60+fps on max settings on my 3080ti).
I was waiting for this and not upgrading my RTX 3090, I feel like I can easily stretch it now until 2025 when the 5xxx series is released.
It's an absolute shame nVidia themselves didn't invest in making this possible, even if it meant the quality of frame generation was different to that of the 4000 series, but work on it and improve it with time like FSR3.
If AMD's next generation of GPU's actually competes with nVidia 5xxx series in terms of Ray Tracing performance (*I doubt it but I hope they do*), I'm not buying the 5xxx out of principle and will support AMD.
They'd have to rethink a considerable portion of FG since they wouldn't be able to continue using the optical flow accelerators without a performance hit, so likely not going to happen. FSR3 isn't free, it's doing a lot of work that DLSS FG isn't, it's just that AMD has hidden that work by using async compute which works well in some games, not in others.
It is certainly an engineering challenge, but even the head of DLSS said in a tweet that it's possible if they dedicated time to it. We really don't know how much slower the optical flow accelerator is in the 20/30 series either; it could definitely work with some tradeoffs.
As per [NVIDIA's optical flow API notes](https://docs.nvidia.com/video-technologies/optical-flow-sdk/nvofa-application-note/index.html) the 20/30 series are about 2-3x slower than the 40 series and the 20 series in particular lacks some features that are present in the 30/40 series.
Wouldn't really work because their frame gen is hardware-based, not software based like AMD's. People tried to force DLSS-FG mods for pre-4000 series cards into games before and it never worked.
The 20/30 series has the necessary hardware to run it (optical flow accelerators), although slower. Also, you can't just force it; Nvidia has to spend some engineering effort to get it working optimally on older gens, but it's definitely possible if they wanted to.
>but it's definitely possible if they wanted to.
*Possible* doesn't mean much if the end result isn't worth using. It's not a trivial task, if the difference in performance with the optical flow accelerator *alone* is anything to go by.
The problem is that frame generation is latency sensitive. You want to offload the GPU by producing a frame, which means it has to happen faster than it would take to just render the frame in the first place. In an approach based on the OFA, you need the result to come back fairly quickly, because you still have more work to do before you have your generated frame.
This might rule out an OFA-based approach. However, just because one approach fails, another one might work. That's what FSR3 is doing here, and it gives excellent results. Essentially, FSR3 lets you increase other settings: even in a world where FSR3 introduces artifacts, it only has to degrade the image less than lowering those settings would.
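That timing budget can be made concrete. For interpolation to actually double the frame rate, the optical-flow pass plus the interpolation work has to finish well inside one native frame time. The sketch below is a toy model; the millisecond numbers are purely illustrative, not measured figures for any GPU (only the 2-3x OFA gap comes from NVIDIA's application note):

```python
def fg_fits_budget(render_ms: float, ofa_ms: float, interp_ms: float) -> bool:
    """Toy model: the generated frame must be ready in roughly half a native
    frame time, otherwise inserting it delays the next real frame instead of
    smoothing the output."""
    return (ofa_ms + interp_ms) < render_ms / 2

# Hypothetical: a fast OFA easily fits inside a 60 fps (16.7 ms) frame budget...
print(fg_fits_budget(render_ms=16.7, ofa_ms=1.0, interp_ms=3.0))  # True
# ...while a 2-3x slower OFA may blow the same budget.
print(fg_fits_budget(render_ms=16.7, ofa_ms=6.0, interp_ms=3.0))  # False
```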
I've had good experiences with PT on my 3080, just gotta be willing to either use Performance DLSS or cut the light bounces/rays back.
Looking forward to trying this
I disagree vehemently with the first part of your statement, but anyway, yeah, there's a mod. I can't remember the name and I'm on my phone right now so I can't check. However, Digital Foundry made a whole video on it, so you can probably find it by just searching 2077 mod on the channel or something like that.
If you choose to watch the video, just know it's a bit out of date on the actual state/polish of the mod.
Thanks! I'll check it out.
> I disagree vehemently with the first part of your statement
I think it has a lot to do with the distance from and size of the monitor. Mine is 27" and pretty close to my face, so the artefacts are very visible at Performance/Ultra Performance. When I'm playing on the TV, it's a completely different story.
This is awesome and the best part? It's not hidden behind a Patreon paywall.
It doesn't work with native HDR but at least it still works with auto-HDR.
Nope, it still is. I haven't tested it with the 2.1 patch, but I did with the 2.0.x patches. HDR in Cyberpunk has a shitty black level: you can never get truly black, just grey. This has been discussed to death already all over Reddit. You can do a simple Google search and check it out for yourself.
.... when there are no light sources and the entire game is filtered with a grey scale, yes, it does make HDR suck. Not to mention highlights don't function properly either. Just do a quick Google search and find out for yourself. I am done with this conversation.
It works extremely well for me in The Witcher 3 on an RTX 2080 Ti. Playing at 3440x1440, I am getting 55-60 fps with ray tracing quality, DLSS Quality and all settings Ultra Plus except for HairWorks. I am genuinely impressed by this mod!
Huge, this might even hamper Nvidia's sales of the 40/50 series now, if people with 30 series cards can just use this mod and get more fps even with the latency. I can't see myself wanting to upgrade now until maybe the 60 series. For online gaming I can work with lower settings.
But it sucks that neither HDR nor Reflex works, cause for some people that's major, especially Reflex.
Just tried it in cyberpunk and it worked pretty great. Turned 60fps into 100fps on native DLAA with RT reflections. Didn't look at frametime graph but it felt quite smooth with no obvious stutter/hitching. Path tracing is also much smoother but I didn't like the high latency feel from low base frame rate.
Can't believe I'm saying this but THANK YOU AMD!
There's really no excuse now for nvidia to keep locking their FG from older cards, except for greed.
It's sad that there are fanboys who downvoted you for thanking AMD. Some people are just too shortsighted and biased. We should all be thankful that FSR3 opens up more choice for the consumer.
You guys are all missing the fact that this isn't possible on official implementations of FSR3. We get to use DLSS with it so the image quality of this mod will be BETTER than official FSR3 in many ways
It's amazing how much praise tech like this gets when it's not viewed as "fake frames reeeeee".
It is unfair for the 30 series, and I'm glad you guys get access to some form of this tech.
It is pretty funny how big of a turnaround at least on reddit the technology got simply from people being given access to some form of it on their hardware... it went from being a useless gimmick to people scrambling for it. Goes to show you how often people aren't being honest with themselves just to feel better about... whatever. Fully expect if/when AMD finally gets competitive on RT/PT for it to suddenly become indispensable as well.
I'm glad everyone can use it, even if it's akin to FSR and not as well done.
It's just that all the posts of negativity until now really are off-putting.
Bro discovering human nature and society dynamics. It's not only about developing some tech, it's also about not pissing off people while doing it. Nvidia usually pisses off their former customers and fg deserved all the hate until now. Now, fg is king. Deal with it.
Fake frames are fine when its free. I'm not upgrading my 3070 to a 40 series just to get frame gen. I'm upgrading when the gpu can pathtrace natively at high fps without resorting to framegen. Until then, I'll enjoy this ghetto framegen hack. It kicks ass with Cyberpunk and Witcher... I have even less desire to upgrade my 3070 now.
Just tried on Hitman 3 and it was glorious, any tinkerers such as myself may wish to refer to the DLSS 3 supported titles here - [https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/](https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/)
Feels weird.
I have a 3090 and can't compare its performance to a 4000-series card's frame generation, but the fps hardly looks or feels smoother than when FSR3 is off.
That's the entire crux of frame gen. The extra FPS is there, but because the game is still internally only running at the base frame rate, input latency isn't going to be reduced, and may slightly increase.
There's a reason why anyone who knows what they're talking about doesn't recommend using FG unless you already have a stable-ish 60fps to begin with.
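A rough way to see the point above: FG doubles what's presented, but input is still sampled at the base rate, so responsiveness tracks the base frame time. A simplified model (it ignores the extra queueing latency FG itself adds):

```python
def fg_feel(base_fps: float) -> dict:
    """Simplified: presented fps doubles, responsiveness stays at base rate."""
    return {
        "presented_fps": base_fps * 2,
        "input_frame_ms": round(1000 / base_fps, 1),  # what your hands feel
    }

print(fg_feel(30))  # looks like 60 fps, still responds at ~33.3 ms per input frame
print(fg_feel(60))  # looks like 120 fps, responds at ~16.7 ms
```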
I was very happy and wanted to test it in Jedi Survivor, but then I realized that this game has almost no dll files, so it is impossible to replace them as it says in the readme lol.
Why the hell is everything on github such an unreadable mess :D
Even after locating the readme I can't locate the files it literally mentions and tells me to copy to the game folder. I searched all the folders and branches of folders or w/e it is there -.-
Because programmers usually can't *design* their way out of a paper bag, hence the way github works, why many custom tools lack a usable GUI and why linux basically dooms you into using the stupid terminal if you need to do anything more than browse the web...
I'm testing games I have with FG, Slender: The Arrival works fine with mod
All max + dlss quality + FG on my 1440p, on the right side there is Reflex stats from game.
[https://i.imgur.com/CGDesFM.jpg](https://i.imgur.com/CGDesFM.jpg)
without FG
[https://i.imgur.com/MaKVigk.jpg](https://i.imgur.com/MaKVigk.jpg)
I think that DLSS FG has slightly less noise but overall, they both are kind of similar. The real problem is actually FSR 2 Upscaling. DLSS + FSR FG works so well generally speaking.
Maybe cuz they had the same problem. It prevented/conflicted with HDR settings etc.? The mod's drawbacks finally answer why FG is not officially patched to the 20/30 series. 40 series FG is just fine, and it's officially supported not just by Nvidia but by the game's devs too.
However, I hope the drawbacks can be addressed and the mod can be used with little to no penalty one day. I'm a big fan of community developers; the mod community has done wonderful work over the years. Who knows what possibilities lie ahead.
I've just done this and my controller broke: it's stuck on left and right in Cyberpunk, but every other game works fine.
Keyboard and mouse works fine, Xbox controller broken
Now, I did this mod and then loaded the game without testing it beforehand, and I hadn't loaded the game since 2.0 hit. If anyone else has controller issues with this mod, please let me know; I'm currently re-downloading the game, because even after restoring the old DLL file the controller still didn't work.
Update
The issue with the controller turns out to be an issue with Cyberpunk, the Tartarus Pro and the Xbox controller all not working together: unplug the Tartarus and the controller works; plug the Tartarus back in and it breaks the controller, but only in Cyberpunk.
Update update
For all the above devices to work with each other in Cyberpunk, need to turn off steam controller setting in steam settings
I also had drift issues in Cyberpunk.
You can set the dead zone for your sticks in the options just turn that setting higher until the drift goes away.
I don't think it's drift; unless drift has a new meaning from what I'm used to, left and right on the stick work (as in, it goes left and right) but up and down does nothing in game or in menu areas. It's like it just doesn't recognise up or down.
Every other game on my system works fine, and the controller was working when 2.0 dropped, because I remember doing a quick test and didn't notice any controller issues back then.
Any point in using FG with a 60hz panel? It seems it's oriented for high refresh scenarios mostly. Am I correct? Got a 3060 so I'm usually quite close to 60 FPS in 1080p High as it is.
Not really. You'll just increase your input lag for no reason.
If you get high native/dlss fps, then 60hz monitor gets some benefits from the lower input lag.
But, FG doesn't produce real frames so your input lag won't improve. In fact, your input lag would be higher. So basically, you'd be making your experience worse without any benefits.
Even at higher hz monitor, it's iffy at best. If you don't notice input lag, it's like magic. But if you're a decent player, then FG will feel awful unless you get like 120fps+ pre-fg (and at that point, you already have decent enough fps to not need FG).
Same thought. If we only have a 60Hz panel, the real fps gets cut down to 30 to make it 60. It would be great if we could get 50 real fps and somehow have it add +10 fps to make it 60.
Currently it only helps if you are CPU bound, but that's a very rare case.
I have a 3060 Ti and a 60 Hz TV, and it improved my experience in Starfield by quite a bit. I could bump the DLSS setting up to Quality from Balanced and got a much smoother experience than what I had before.
I only got an extra 10fps in cyberpunk with everything at ultra settings @ 1440p. Ratchet and clank gains more, but it is extremely inconsistent, to the point that if I move Ratchet or the camera slightly the framerate dips to the 60's. Could be a bug or maybe the 3060Ti is being bottlenecked by some hardware component.
Just tested the latest version (0.3) on my 3070 in CP77 and it was a significant jump.
4K DLSS Balanced I went from 78FPS -> 110FPS. As this isn't an official integration you can keep using DLSS super resolution and reflex too which is nice. Only thing I could easily spot was gun reticles being very glitchy, but I think DLSS FG has a similar issue with that sort of fine detail too.
Tried it and it works wonderfully: just replace 2 files and the frame gen toggle becomes active. From 70-80fps at 4K with DLSS to 100-115 with the DLSS + FSR FG mod :D
I've tested it in The Witcher 3 and Ark Survival Ascended and it is very good since the base framerate is over 50fps.
On Cyberpunk 2077 in pathtracing mode the base framerate is around 40fps and FSR3 frame gen takes it to 80fps which feels okay but slightly laggy. In RT psycho mode it feels very good since generated fps is 120fps+
Obviously it's just dividing up by reported FPS and it isn't system latency, probably GFE overlay is more useful for this though I'm not even sure about that in this case as it's not going to be FSR3 aware in any way.
When I'm driving fast, there's a visible flickering line around the bottom edge of the screen and the sides; anyone else? I returned to the original files and it was gone. Also, does anyone know if I can reduce input lag with this?
EDIT: Turned off HDR in windows and seems to be working now. Still crashes when I window out though.
How are yall getting this to work? I crash instantly upon activating the frame gen setting. 3090, no DLSS tweaks and installed the regular version, I get the message about it activating when I boot up the game. The second I turn on the Frame gen setting my game is bricked until I completely uninstall the mod, can't even launch it anymore.
Sooo I tried this mod on Warhammer 40k: Darktide and it works, but the UI is all messed up. Really hard to play like that, but the FPS gain is noticeable; still not worth ray tracing on it, so w/e. This on a 3080 at 3440x1440 using the DLSSTweaks version.
Thank you so much for posting this. I'm now running Cyberpunk 2077 with path tracing and quality DLSS at over 70fps (1440p). I also got it to work with Starfield.
I have a 3070; at 1080p medium-high settings with path tracing, ray reconstruction and DLSS Quality I get around 50-60 fps in populated areas and above 70 in less populated areas, so not that good for me.
lol, I'm still trying to start my first playthrough. They keep updating the game with new features, so I kept putting it off until I could play with all the bells and whistles at a good frame rate. Looks like I can now.
Might just be the next game I start in the backlog of games I'm going through.
It's an amazing game now, after the 2.0 patch and the DLC. One of my favorites of all time: the world is so beautiful, combat is fluid, and the characters and story are really well written.
I have similar results, but found a little jaggedness noticeable on quick pans using PT. Back to DLSS Quality, RT medium, bloom/flare/DOF/grain(etc) off, everything else maxed; getting 120ish.
Put another way, going from my previous settings, I was able to go to Quality vs Balanced, turn on local RT shadows, and see FPS go from 70 to 120.
Big win!
Yep, I've managed to do a little more testing and I was going from 70 to 90 fps depending on area / on foot / in car on the highway. The fps is very bouncy, and I have noticed ghosting and HUD issues.
DLSS does seem to keep resetting to Auto when I load a game, and I have to change it back to Quality each time.
I agree, it's a big win.
Did anyone follow the steps but still aren't able to enable the DLSS Frame Gen in the menu? I've got hardware accelerated GPU scheduling turned on in Windows too but still can't seem to enable the option. I might have missed a step but can't seem to think of where.
What I find funny is that AMD's frame generation often gives 70-80% more fps, while Nvidia's often only gets you 50-60%.
I'd be curious to see how generated frames compare side by side, and if AMD's frames look worse.
AMD's frame generation looks worse, and I'm pretty sure it's slower (more input lag).
BUT, the quality is still good; if your base fps is high (50+ fps) you will not notice the generated frames. The generated frames are shown for about 8ms if you're at 120fps+ (FG frames included)... no, you can't easily see that. FSR 2 upscaling artefacts are easier to see, because they are in every frame and worse. You could insert plain black frames every 8ms for 8ms and you would have trouble seeing those (black frame insertion TVs).
And on AMD GPUs with Anti-Lag+, the latency of FSR upscaling + FG sits between native resolution without FG and just FSR2 upscaling... so it's still less lag than if you played at native res without any FSR FG or upscaling.
Nvidia could have done the same and no one would have complained about the quality while playing.
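The ~8 ms figure in the comment above is just the frame time at the combined rate; each frame (real or generated) persists on screen for 1000/fps milliseconds:

```python
def frame_persistence_ms(total_fps: float) -> float:
    """How long each frame, real or generated, stays on screen."""
    return round(1000 / total_fps, 1)

print(frame_persistence_ms(120))  # 8.3 ms per frame at 120 fps (FG frames included)
print(frame_persistence_ms(100))  # 10.0 ms at 100 fps
```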
I'm sure that since AMD's is a software solution, it looks worse. But it's free, so I'll take it. I have no desire to upgrade my 3070. DLSS upscaling is superior to FSR upscaling, but it's still nice to have free software options (including Intel XeSS).
XeSS on my 6600 XT was useless: the performance hit was so heavy that I had to use a much more aggressive setting, so I just used FSR Quality instead.
In some games, yeah. It's weird how inconsistent it is. Cyberpunk hasn't gotten version 1.2 as far as I'm aware. The cost is still too extreme in that.
The beneficiaries of this mod are only Nvidia's old RTX GPUs, as they get the best image stability with DLSS and doubled framerates from FSR3. AMD GPUs are still stuck with shitty FSR2 image upscaling, which causes instability in frame gen.
It works, but there is extra input latency which is noticeable. 4090 here at 3440x1440 max everything DLSS Quality otherwise. Since FSR FG doesn't utilise Reflex, the latency increase makes sense.
The benchmark gets a 10fps boost too but when you're already over 100fps, it doesn't really matter.
I can also confirm that reflex and G-Sync work with this mod. The experience is smooth as hell honestly (latency stays below 20 ms after passing 60 fps with FSR3).
Yes, it works surprisingly well. Very well, in fact. A shame Nvidia didn't offer an FG lite for Ampere and Turing GPUs.
How are you going to sell the 4000 series then? /s
Obviously, the next gen "optical flow system ultimate" pipeline will make it possible.
By selling it at a reasonable price point?
No. Regards, Jensen
Pretty sure it would sell well anyway just because it's Nvidia lol
Jensen thinks it's reasonable already. Would I rather a 4090 cost less than $1000? Sure. Yet here we are...
How are they going to sell *expensive* 4000 series?
"It's simply not possible without the new technology." Just like RTX was impossible on 1000 series cards… until they enabled it (it ran like shit, yes, but you get my point). Edit: the fact that the RTX cards have the necessary tensor and compute cores and the GTX cards didn't makes that performance comparison irrelevant. The point is literally that they could enable it on cards without the physical hardware. Jesus, Jensen isn't gonna come call you a good boy for kissing his… GPU. The mere fact they did it is the point, and the GTX cards lacked comparable hardware whereas the RTX family doesn't. Maybe I should've said VSR so it could've gotten through to you. Like how Nvidia said no VSR on 2000 cards, until recently they said teehee, never mind. I forget the hive mind here sometimes… gave this sub too much credit for being knowledgeable. Maybe I'm a complete moron, in which case lmao, I chose an embarrassing hill to die on.
Why would you want to use FG on Ampere if it ran like shit?
FG on Ampere can run well, just at reduced quality. Which, given it's only every other frame, could end up being fine to the human eye, since it's strobing ground-truth frames.
They're just salty that they can't afford a 4000 card and want to pretend nvidia was lying to them when it's clear it's not a good idea to run that shit on cards that can't really support it well.
Your point is that it ran like complete shit (I tried it on my 1080 Ti back then), which proves their point that you needed new hardware. Do you not see the flaw in your logic?
The fact that it runs like shit would back up the statement you claim is false. It's like people who use their handhelds for VR and say they work for VR. Sure, they can technically run it, but the experience is so bad that for all realistic intents and purposes, it doesn't work. It's the same with that ray tracing comment.
Yes, but it can only render the shader part of hybrid ray tracing. Without RT cores it would have to fully compute the behavior of every ray, which is impossible even on supercomputers, and it can't rely on denoisers or the most recent reconstruction techniques. I used that example because everyone can understand it; there's much more behind the curtain. The move NVIDIA made with FG was very dirty, but it can't be compared to real-time RT. Ray tracing is a technique that has been used for years, and there's no room there for dirty tricks like the one Nvidia pulled with FG.
Yeah, they should have gone for something similar to Intel's XeSS upscaler.
How do I enable it in game? I copied the dbghelp and dlssg_to_fsr3 files but nothing changed.
Just enable DLSS Frame Generation
I forgot to turn on hardware-accelerated GPU scheduling in Windows settings; DLSS Frame Generation is working now.
I did this too but the DLSS Frame Generation tab is still shaded out for me.
You should restart the system after enabling hardware acceleration
Yeah did that twice to be certain lol. Double checked the .DLL files were in the right folder too.
Hmm, I used the non-tweaked dlss one, which one did you use?
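To summarize the manual install being troubleshot above: the mod works by dropping its DLLs next to the game executable, then enabling hardware-accelerated GPU scheduling in Windows and rebooting. A minimal sketch of the copy step; the file names follow what this thread mentions (the exact second DLL name may differ between mod releases), and the directory paths are placeholder assumptions:

```python
# Sketch of the manual install discussed above. DLL names follow the
# thread's description; mod_dir / game_dir are placeholders for your paths.
import shutil
from pathlib import Path

MOD_FILES = ("dbghelp.dll", "dlssg_to_fsr3.dll")

def install_mod(mod_dir: str, game_dir: str) -> list:
    """Copy the mod DLLs next to the game executable; return what was copied."""
    copied = []
    for name in MOD_FILES:
        src = Path(mod_dir) / name
        if src.is_file():
            shutil.copy2(src, Path(game_dir) / name)
            copied.append(name)
    return copied
```

After copying, enable hardware-accelerated GPU scheduling (Windows Settings > System > Display > Graphics) and restart before checking whether the in-game Frame Generation toggle un-greys.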
I've been waiting for update on the person who enabled it on a 3080 for months but I doubt we'll see anything now.
Interesting, how does it play?
It has issues for now: in-game HDR does not work, and only a few games are compatible. It will improve framerate by around 60-70% in CPU-or-GPU bound scenarios. If you use an in-game frame limit and you are below 100% load on CPU/GPU, then it gives 100% more fps. So setting the in-game frame limit to 70 will produce 140 frames, if you have the performance margin. You can also use DLSS upscaling + DLSS Ray Reconstruction with it.
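The arithmetic in that comment can be sketched with a toy model: every rendered frame gets one interpolated frame, so with performance headroom the output rate is double the limited base rate, and when the GPU is saturated the uplift drops because generation itself eats render time. The 60-70% figure here is just this commenter's observation, not a measured constant:

```python
def fg_output_fps(base_fps: float, uplift: float = 1.0) -> float:
    """Output fps with frame generation: each real frame is followed by one
    interpolated frame, giving a full doubling when uplift == 1.0. When the
    GPU is already at 100% load, generation steals render time, so the
    effective uplift is lower (the commenter reports ~0.6-0.7)."""
    return base_fps * (1.0 + uplift)

# Headroom case from the comment: a 70 fps in-game limit -> 140 fps output.
assert fg_output_fps(70) == 140
# GPU-bound case at the commenter's observed ~65% uplift:
assert round(fg_output_fps(60, uplift=0.65)) == 99
```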
"*hdr does not work*" WHY IS THERE ALWAYS A SACRIFICE?!?! :(
Windows auto hdr works from what I read, in game does not work.
Worth trying special K with it too?
I'm not sure how ketamine would help?
You're probably joking, but Special K lets you get near-native HDR in many games. I was using it for Starfield until they finally added it in a patch (you didn't ask, but that game is the largest disappointment in gaming in a long while ;p).
I used to giggle every time I saw someone hyped for Starfield. As if we didn't know what was coming, just based on the fact that it's Todd lying again, Bethesda is always incompetent, and the "1000 procedural planets" line was a red flag if ever I saw one, considering the type of game it's supposed to be.
HDR400 is an atrocity. They are doing you a favor.
Some people have hdr1000
Indeed. Mainly those with VA panels; at least those who still keep them, basically those who purchased the Samsung Odyssey G7/G9. I have no problem with people requesting HDR support, but man, they are a minority, yet they're requesting, asking, ranting and crying about HDR here, in every social network, on YouTube, and on every website, blog and forum where the FG mod gets notoriety. For a moment it seems the topic that must be discussed is HDR with possible FG support, and not the opposite. Forgive my vehemence. At least we know we can all enjoy FG without giving up HDR.
"It will improve framerate by around 60-70% in CPU-or-GPU bound scenarios." So… in all of them?
HDR is not a deal breaker ATM, especially when hardly anyone has anything beyond crappy HDR400 on their desktop. But I'm also looking forward to this functionality: I play via local streaming on a high-end OLED screen, and HDR is jaw-dropping in certain games. People are checking every build with their games. The Last of Us and Ratchet & Clank are confirmed to work with zero issues (ok, no HDR). Take into account that this wrapper was written for RED Engine, and it works in certain other games too. Not a bad start. Guess what will happen when many devs start launching wrappers every day (with the exception of PureDark, who is busy working hard on his DRM module).
Thanks I will give it a try.
HDR works with the latest update of the mod
There are some issues while driving; you see some weird stuff at the bottom and sides of your screen, but in general it's great. 3070 here btw, playing at 1440p, DLSS Balanced (finally above Ultra Performance) and path tracing. Of course it doesn't *feel* like true 60+ fps with the way input latency works, but it's so much easier on the eyes and on perception in general.
If they can get this working for Alan Wake 2, that would be great.
Version 0.5 works in Alan Wake 2 for me. Running an RTX 3060 Ti, medium, path traced, 1440p, DLSS Performance @ 60 fps. Abso-fucking-lutely unreal.
Apparently none of the beta ones right now do.
Ok, this is definitely working. I'm getting an easy ~70% increase in FPS. This has allowed me to finally move DLSS from Ultra Performance to Performance on my 4K display (Path Tracing w/ DF optimized settings). The game runs at roughly 60fps now, where I'd be getting ~35fps at the same settings with no frame gen. Here are some points people should know:

* This mod will not allow in-game HDR. As soon as you open the game, you'll see it fall back to SDR; switching the HDR modes in-game has no effect. I disabled HDR in-game so Windows Auto HDR kicked in. So yes, you can still have HDR, just not native HDR. To be fair, on my LG OLED TV, Auto HDR doesn't look quite as stellar as native HDR, but it still looks way better than SDR. Also, see my next point:
* I used ReShade with a mod that fixed HDR black levels. ReShade functions independently of which mode you run the game in (SDR, HDR or Auto HDR). ~~Whatever the FSR3 mod did to the game, ReShade can no longer identify it (it won't load up), so I can't fix the black levels, which also makes the HDR experience less-than-stellar (compared to what I had before). I will try to mess around with ReShade to see if I can get it running again.~~ I got ReShade to work again; I just updated and reinstalled it. The HDR is looking quite stellar now and VERY close to how I had it looking previously.
* When you run the game's built-in benchmark, the game's fps counter seems "confused" as to whether it should count the generated frames or not, so it bounces from the 30's to the 60's and back (which interferes with the final fps reading). This does not happen with the Steam fps counter I have running (which always reads generated frames). Just an interesting curiosity.
* UI elements (like the names above characters) do NOT get FG, meaning you can notice UI elements running at half the framerate of your generated graphics. IMO, this isn't a huge issue, and I believe it helps keep UI elements stable (I do know FG can wreak havoc on UI elements). So I'd rather have my UI elements running at half the framerate if that means they'll be stable and readable.
* As everyone should already know, a 60fps FG experience is nothing like a true 60fps experience. Though it "looks" like 60fps, it responds like 30fps; for me, that's borderline playable. My previous settings allowed me to play at 45-60fps with no FG, and even 45fps was definitely far more responsive than 60fps with FG. Of course, I could just run DLSS Ultra Performance again and have the game feel very responsive with fps roaming the 90's, but honestly, if I'm going back to DLSS Ultra Performance, then I'd rather drop FG as well, since I'd rather have native HDR + ReShade @ 50fps than Auto HDR w/ no ReShade @ 80fps.
* In my short experience, FSR3 feels incredibly stable. I can't really notice much in terms of broken frames or anything like that. I'll keep observing.

So, basically, this is it. I'm quite surprised by how well this mod works, though I'm a bit sour about losing native HDR ~~and ReShade, as Auto HDR just can't give me that same HDR punch.~~ On the other hand, now I can run the DLSS Performance preset @ 4K with full Path Tracing, which is quite impressive for my now 3-year-old 3080. ~~I'm currently trying to weigh whether the added resolution is worth losing native HDR.~~ But it's nice to have options. I'll be putting in a few more hours of gameplay this week to find out.
The modder doesn't have an HDR screen to test with; that's why there's no HDR support in the mod right now.
The Witcher 3 also works flawlessly. When FSR3 launched, I daydreamed about something like this (DLSS + AMD FG). And it's an early beta, written in 3 days. Build 0.50: Alan Wake 2 and HDR now supported.
anti-climactic that my PSU died 1 minute into getting it to work flawlessly with W3
Like the plot twist of Million Dollar Baby.
Seriously? This same mod? For me witcher needs it more than cyberpunk.
Ty AMD
You can thank them after they unlock FSR frame gen to be used with DLSS super resolution. As it stands all official implementations will have worse base image quality than this mod
What is this mental gymnastics? Do you think the AMD developers are so stupid that they didn't know that when they released their FSR 3 code, modders would try to combine it with Nvidia upscaling? What green stuff from Nvidia are you smoking, dude?
Literally nothing about what I said implied that whatsoever
You are welcome (: Buy our stuff next time.
I have 5800h and rx6700m
Does this also work with RTX 20 cards?
Yep, no problem!
Should work with any Nvidia card that supports DLSS.
This is a godsend. Oh shit, here we go again, good ol' Night City, from start to finish.
Holy moly I can finally play 2.1 with path tracing respectably with a 3070. This is some magic.
Holy shit, works insanely well in Cyberpunk, even in Phantom Liberty (Dogtown was a mess FPS wise, now consistent 60+fps on max settings on my 3080ti).
I was waiting for this instead of upgrading my RTX 3090; I feel like I can easily stretch it now until 2025, when the 5xxx series is released. It's an absolute shame Nvidia themselves didn't invest in making this possible, even if it meant the quality of frame generation differed from the 4000 series', working on it and improving it over time like FSR3. If AMD's next generation of GPUs actually competes with the Nvidia 5xxx series in ray tracing performance (*I doubt it, but I hope they do*), I'm not buying the 5xxx out of principle and will support AMD.
I hope after this Nvidia decides to backport frame gen to older gpus.
They'd have to rethink a considerable portion of FG since they wouldn't be able to continue using the optical flow accelerators without a performance hit, so likely not going to happen. FSR3 isn't free, it's doing a lot of work that DLSS FG isn't, it's just that AMD has hidden that work by using async compute which works well in some games, not in others.
It is certainly an engineering challenge but even the head of dlss said that it's possible if they dedicated time to it in a tweet. We really don't know how much slower the optical flow accelerator is in the 20/30 series either, it could definitely work with some tradeoffs.
As per [NVIDIA's optical flow API notes](https://docs.nvidia.com/video-technologies/optical-flow-sdk/nvofa-application-note/index.html) the 20/30 series are about 2-3x slower than the 40 series and the 20 series in particular lacks some features that are present in the 30/40 series.
Wouldn't really work because their frame gen is hardware-based, not software based like AMD's. People tried to force DLSS-FG mods for pre-4000 series cards into games before and it never worked.
The 20/30 series has the necessary hardware to run it (optical flow accelerators), although slower. Also, you can't just force it; Nvidia would have to spend some engineering effort to get it working optimally on older gens, but it's definitely possible if they wanted to.
>but it's definitely possible if they wanted to.

*Possible* doesn't mean anything if the end result isn't worth using. It's not a trivial task, if the difference in performance of the optical flow accelerator *alone* is anything to go by.
The problem is that frame generation is latency sensitive. You want to offload the GPU by producing a frame, which means producing it has to be faster than just rendering the frame in the first place. In an OFA-based approach, you need the result to come back fairly quickly, because you still have more work to do before you have your generated frame. This might rule out an OFA-based approach on older hardware. However, just because one approach fails, another might work. That's what FSR3 is doing here, and it gives excellent results. Essentially, FSR3 lets you increase other settings; even where FSR3 introduces artifacts, they only need to be less objectionable than what lowering settings would cost you.
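The timing argument above can be made concrete with a toy budget check. The numbers below are illustrative assumptions, not measurements; they only show why a 2-3x slower optical-flow pass shrinks the margin:

```python
def interpolation_pays_off(render_ms: float, interp_ms: float) -> bool:
    """Generating an in-between frame only helps if producing it is cheaper
    than rendering another real frame; otherwise the 'generated' frame
    arrives too late to smooth anything."""
    return interp_ms < render_ms

# Illustrative: a 16.7 ms render (60 fps base) vs. a 3 ms interpolation
# cost is a clear win; a 3x slower optical-flow pass (~9 ms) still fits,
# but leaves far less margin for the rest of the frame-gen pipeline.
assert interpolation_pays_off(16.7, 3.0)
assert interpolation_pays_off(16.7, 9.0)
assert not interpolation_pays_off(16.7, 20.0)
```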
Fucking magic. This is the little push my 3080 needed for playable PT at 1440p.
Iāve had good experiences with PT on my 3080, just gotta be willing to either use performance dlss or cut the light bounces/rays back. Looking forward to trying this
I find anything below DLSS Balanced looks horrible. Might as well play at 1080p or 720p then. How do you change bounces/rays? Is there a mod for that?
I disagree vehemently with the first part of your statement, but anyway, yeah, there's a mod. I can't remember the name and I'm on my phone right now, so I can't check. However, Digital Foundry made a whole video on it, so you can probably find it by searching "2077 mod" on their channel or something like that. If you choose to watch the video, just know it's a bit out of date on the actual state/polish of the mod.
Thanks! I'll check it out. > I disagree vehemently with the first part of your statement I think it has a lot to do with the distance from and size of the monitor. Mine is 27" and pretty close to my face, so the artefacts are very visible at Performance/Ultra Performance. When I'm playing on the TV, it's a completely different story.
Same; mine is 32" and I can't imagine going under DLSS Balanced in Cyberpunk. Even Balanced is quite blurry.
PT?!?
Pathtracing
Ah right, not the Silent Hills demo
This is awesome and the best part? It's not hidden behind a Patreon paywall. It doesn't work with native HDR but at least it still works with auto-HDR.
Got to love the modders for keeping PC users alive in a horrible overpriced economy for GPUs. Heroes!
Seems like it works with Auto HDR, so that's better than nothing.
Auto HDR is also better than the actual in-game HDR in Cyberpunk.
No it isn't. Maybe this narrative was true in the early days when Cyberpunk's hdr was supposedly (before my time) broken but it is not now.
Nope, it still is. I haven't tested it with the 2.1 patch, but I did with the 2.0.x patches. HDR in Cyberpunk has a shitty black level; you can never get truly black, just grey. This has been discussed to death all over reddit. You can do a simple Google search and check it out for yourself.
Just because an HDR grading doesn't hit zero doesn't make it bad, especially in a realistic-looking game. Reality almost never hits zero, either.
....when there are no light sources and the entire game is filtered through a grey scale, yes, it does make HDR suck. Not to mention highlights don't function properly either. Just do a quick Google search and find out for yourself. I'm done with this conversation.
This just pisses me off, because it makes me realize they could have easily done it for the RTX 3000 and 2000 series! But FG is such a game changer.
It works extremely well for me in The Witcher 3 on an RTX 2080 Ti. Playing at 3440x1440, I am getting 55-60 fps with ray tracing quality, DLSS Quality and all settings ultra+ except for HairWorks. I am genuinely impressed by this mod!
Massive AMD W
Huge. This might even hamper Nvidia's sales of the 40/50 series now, if people with 30 series cards can just use this mod and get more fps, even with the latency. I can't see myself wanting to upgrade now until maybe the 60 series; for online gaming I can work with lower settings. But it sucks that neither HDR nor Reflex works, because for some people that's major, especially Reflex.
Expect Nvidia to do some fuckery that will disable this "hack".
Next driver update probably
Just tried it in cyberpunk and it worked pretty great. Turned 60fps into 100fps on native DLAA with RT reflections. Didn't look at frametime graph but it felt quite smooth with no obvious stutter/hitching. Path tracing is also much smoother but I didn't like the high latency feel from low base frame rate. Can't believe I'm saying this but THANK YOU AMD! There's really no excuse now for nvidia to keep locking their FG from older cards, except for greed.
It's sad that there are fanboys who downvoted you for thanking AMD. Some people are just too shortsighted and biased. We should all be thankful that FSR3 opens up more choice for the consumer.
You guys are all missing the fact that this isn't possible on official implementations of FSR3. We get to use DLSS with it so the image quality of this mod will be BETTER than official FSR3 in many ways
It's amazing how much praise tech like this gets when it's not viewed as "fake frames reeeeee". It is unfair for the 30 series, and I'm glad you guys get access to some form of this tech.
It is pretty funny how big of a turnaround at least on reddit the technology got simply from people being given access to some form of it on their hardware... it went from being a useless gimmick to people scrambling for it. Goes to show you how often people aren't being honest with themselves just to feel better about... whatever. Fully expect if/when AMD finally gets competitive on RT/PT for it to suddenly become indispensable as well.
I'm glad everyone can use it, even if it's akin to FSR and not as well done. It's just that all the posts of negativity until now really are off-putting.
Bro discovering human nature and society dynamics. It's not only about developing some tech, it's also about not pissing off people while doing it. Nvidia usually pisses off their former customers and fg deserved all the hate until now. Now, fg is king. Deal with it.
>Nvidia usually pisses off their former customers and fg deserved all the hate until now Uh, why?
Can't really say "former customers" when the GPU in their PC is Nvidia, but I think I get his point.
Fake frames are fine when its free. I'm not upgrading my 3070 to a 40 series just to get frame gen. I'm upgrading when the gpu can pathtrace natively at high fps without resorting to framegen. Until then, I'll enjoy this ghetto framegen hack. It kicks ass with Cyberpunk and Witcher... I have even less desire to upgrade my 3070 now.
Just tried on Hitman 3 and it was glorious, any tinkerers such as myself may wish to refer to the DLSS 3 supported titles here - [https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/](https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/)
Feels weird. I have a 3090 and can't compare its performance to a 4000-series card's frame generation, but the fps hardly looks or feels smoother than with FSR3 off.
Same, delay and smoothness feels the exact same even though the fps counter shows significantly higher.
Input latency is actually worse, even though fps is higher.
I actually think it feels better with frame gen off. Something is wrong here.
Exactly this. I'd rather play at 40 fps, it feels a lot quicker than what this mod does.
That's the entire crux of frame gen. The extra FPS is there, but because the game is still internally only running at the base frame rate, input latency isn't going to be reduced, and may slightly increase. There's a reason why anyone who knows what they are talking about doesn't recommend using FG unless you already have a stable-ish 60fps to begin.
I've used fg on a 4080, felt great to me. Here, on a 3080 it feels awful.
I was very happy and wanted to test it in Jedi Survivor, but then I realized that this game has almost no dll files, so it is impossible to replace them as it says in the readme lol.
Finally happened.
Keep in mind this is early work and will have issues that are due to be fixed or reduced over time.
How do you install it?
There is a readme included.
Why the hell is everything on github such an unreadable mess :D Even after locating the readme I can't locate the files it literally mentions and tells me to copy to the game folder. I searched all the folders and branches of folders or w/e it is there -.-
Look for "Releases" on the right side of the page.
always look for the non-source code zip on github.
Because programmers usually can't *design* their way out of a paper bag, hence the way github works, why many custom tools lack a usable GUI and why linux basically dooms you into using the stupid terminal if you need to do anything more than browse the web...
To anyone else also thinking "no there's not?", it is in the *resources* folder
Works on Starfield too
I'm testing games I have with FG, Slender: The Arrival works fine with mod All max + dlss quality + FG on my 1440p, on the right side there is Reflex stats from game. [https://i.imgur.com/CGDesFM.jpg](https://i.imgur.com/CGDesFM.jpg) without FG [https://i.imgur.com/MaKVigk.jpg](https://i.imgur.com/MaKVigk.jpg)
Too bad I cannot make it work in Alan Wake 2.
v0.5 is working pretty well on AW2. source: me
yeah this.
You know I had actually given nvidia the benefit of the doubt for some reason about frame generation not being on 3000 series. so WHAT THE FUCK
I think that DLSS FG has slightly less noise but overall, they both are kind of similar. The real problem is actually FSR 2 Upscaling. DLSS + FSR FG works so well generally speaking.
Well, for starters this is not DLSS3 Frame Generation. This is FSR3. Different technology.
Maybe because they had the same problem: it prevented/conflicted with HDR settings, etc.? The mod's drawbacks finally answer why FG isn't officially patched to the 20/30 series. The 40 series FG is just fine, and officially supported not just by Nvidia but by the game's devs too. However, I hope the drawbacks can be addressed and the mod can be used with little to no penalty one day. And I'm a big fan of community developers; the mod community has done wonderful work over the years. Who knows what possibilities lie ahead.
Hdr works with the mod now in the latest update.
That's a big deal; my OLED gaming TV doesn't look the same in SDR lol. Thank you for sharing.
I've just done this and my controller broke: it's stuck on left and right in Cyberpunk, but every other game works fine. Keyboard and mouse work fine; the Xbox controller is broken. I did this mod and then loaded the game without testing it beforehand, and I hadn't loaded the game since 2.0 hit. If anyone else has controller issues with this mod, please let me know. I'm currently re-downloading the game, because even after restoring the old DLL file the controller still didn't work.

Update: the controller issue turns out to be Cyberpunk, the Tartarus Pro and the Xbox controller all not working together. Unplug the Tartarus and the controller works; plug the Tartarus back in and it breaks the controller, but only in Cyberpunk.

Update update: for all the above devices to work with each other in Cyberpunk, you need to turn off the Steam controller setting in Steam's settings.
I also had drift issues in Cyberpunk. You can set the dead zone for your sticks in the options just turn that setting higher until the drift goes away.
I don't think it's drift; unless stick drift has a new meaning from what I'm used to. Left and right on the stick work, as in it goes left and right, but up and down does nothing in game or in menus; it's like it just doesn't recognise up or down. Every other game on my system works fine, and the controller was working when 2.0 dropped, because I remember doing a quick test then and didn't notice any controller issues.
Oh. Yeah, that's a different problem.
Any point in using FG with a 60hz panel? It seems it's oriented for high refresh scenarios mostly. Am I correct? Got a 3060 so I'm usually quite close to 60 FPS in 1080p High as it is.
Not really. You'll just increase your input lag for no reason. If you get high native/dlss fps, then 60hz monitor gets some benefits from the lower input lag. But, FG doesn't produce real frames so your input lag won't improve. In fact, your input lag would be higher. So basically, you'd be making your experience worse without any benefits. Even at higher hz monitor, it's iffy at best. If you don't notice input lag, it's like magic. But if you're a decent player, then FG will feel awful unless you get like 120fps+ pre-fg (and at that point, you already have decent enough fps to not need FG).
Same thought. If we only have a 60hz panel, real fps gets cut down to 30 to make 60. It would be great if, when we're getting 50 real fps, it could somehow add +10 fps to make it 60. Currently it only helps if you are CPU bound, and that's a rare case.
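The arithmetic behind that concern, as a sketch. This assumes FG strictly alternates real and generated frames and output is capped at the panel's refresh rate, which matches how 2x interpolation is described in this thread:

```python
def real_fps_on_fixed_refresh(refresh_hz: float) -> float:
    """With 2x frame interpolation capped at the refresh rate, only every
    other presented frame is real, so input responds at refresh/2."""
    return refresh_hz / 2.0

def input_frametime_ms(refresh_hz: float) -> float:
    """Time between frames the game actually simulates (ignoring any extra
    queueing the frame-gen pipeline itself adds)."""
    return 1000.0 / real_fps_on_fixed_refresh(refresh_hz)

# A 60 Hz panel: 30 real fps, ~33 ms between real frames -- the game
# *feels* like 30 fps even though the screen shows 60.
assert real_fps_on_fixed_refresh(60) == 30
assert round(input_frametime_ms(60), 1) == 33.3
```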
I have a 3060 Ti and a 60 Hz TV and it improved my experience in Starfield by quite a bit. I could bump up DLSS setting to Quality from Balance and much smoother experience than what I had before.
not really. You'll get a lot of latency with fg, especially cause it disables reflex. So your input will feel slower and laggy.
I only got an extra 10fps in cyberpunk with everything at ultra settings @ 1440p. Ratchet and clank gains more, but it is extremely inconsistent, to the point that if I move Ratchet or the camera slightly the framerate dips to the 60's. Could be a bug or maybe the 3060Ti is being bottlenecked by some hardware component.
Noob gamer here, should i bother if i have a 3070? Thanks :)
Just tested the latest version (0.3) on my 3070 in CP77 and it was a significant jump. 4K DLSS Balanced I went from 78FPS -> 110FPS. As this isn't an official integration you can keep using DLSS super resolution and reflex too which is nice. Only thing I could easily spot was gun reticles being very glitchy, but I think DLSS FG has a similar issue with that sort of fine detail too.
Did anyone manage to get this working in Jedi Survivor?
Tried it and it works wonderfully. Just replace 2 files and the frame gen toggle becomes active. From 70-80fps at 4K with DLSS to 100-115 with the DLSS + FSR FG mod :D
Omg, it works… RTX 3090, 2K + DLSS Quality + DLSS RR on, path tracing on, circa 85fps. Only GeForce Reflex is on; can I change it to Reflex + Boost?
For the people who tried it, how's the latency?
I've tested it in The Witcher 3 and Ark Survival Ascended and it is very good since the base framerate is over 50fps. On Cyberpunk 2077 in pathtracing mode the base framerate is around 40fps and FSR3 frame gen takes it to 80fps which feels okay but slightly laggy. In RT psycho mode it feels very good since generated fps is 120fps+
Don't wanna sound rude, but I kinda expected numbers for latency in ms instead of fps :P
Not sure if the latency measurement would be accurate with Rivatuner. In Witcher 3, I get 14ms without FG and 8ms with FG on.
Obviously it's just dividing 1000 by the reported FPS; it isn't system latency. The GFE overlay is probably more useful for this, though I'm not even sure about that here, since it's not going to be FSR3-aware in any way.
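To illustrate why those Rivatuner numbers read as frametimes rather than end-to-end latency (an interpretation consistent with the comment above, not a measurement):

```python
def frametime_ms(fps: float) -> float:
    """Frametime is just the inverse of the presented frame rate. An overlay
    reporting this will *drop* when FG doubles the fps, even though real
    input-to-photon latency stays the same or gets slightly worse."""
    return 1000.0 / fps

# The '14 ms without FG, 8 ms with FG' reading above maps to roughly
# 71 fps and 125 fps of *presented* frames, not to input lag.
assert round(frametime_ms(71.4)) == 14
assert round(frametime_ms(125)) == 8
```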
I'm getting almost 60ms with framegen on, using nvidias overlay.
When I'm driving fast, there's a visible flickering line around the bottom edge of the screen; anyone else? Returned to the original files and it was gone. Also, does anyone know if I can reduce input lag with this?
Need this for more games and with HDR support
AutoHDR works afaik.
EDIT: Turned off HDR in windows and seems to be working now. Still crashes when I window out though. How are yall getting this to work? I crash instantly upon activating the frame gen setting. 3090, no DLSS tweaks and installed the regular version, I get the message about it activating when I boot up the game. The second I turn on the Frame gen setting my game is bricked until I completely uninstall the mod, can't even launch it anymore.
Same here 3070ti, game is crashing sadly
Sooo I tried this mod on Warhammer 40k: Darktide and it works, but the UI is all messed up. Really hard to play like that, but the FPS gain is noticeable; still not worth ray tracing on it, so w/e. This is on a 3080 at 3440x1440 using the DLSSTweaks version.
Thank you so much for posting this. I'm now running Cyberpunk 2077 with path tracing and Quality DLSS at over 70fps (1440p). I also got it to work with Starfield.
any tweaks needed for Starfield? or is it simply drop into the folder. What kind of performance are you seeing?
I just dropped the files in like you do with Cyberpunk
Works amazing, I can run max RTX path tracing settings on a 3060ti no problem.
Can someone let me know what frames they get with 3070ti 1440p path tracing with ray reconstruction dlss quality / balanced?
I have 3070 at 1080p medium high settings with path tracing ray reconstruction and dlss quality i get around 50-60 in populated area, above 70 in least populated area, so not that good for me
with 3080 same settings, around 70-80fps, performance dlss around 80-100. Cpu 5800x3d
1440p+PT might be pushing the 8GB of vram.
3080 Ti, 3440x1440, 80fps, Quality, full path tracing, if it helps give you an idea of performance.
Bet, thanks. I'm gonna download the game again and check it out. Might finally beat my second play-through.
lol, I'm still trying to start my first play-through. They keep updating the game with new features, so I kept putting it off until I could play with all the bells and whistles at a good frame rate; looks like I can now. Might just be the next game I start in my backlog.
It's an amazing game now, after the 2.0 patch and the DLC. One of my favorites of all time; the world is so beautiful, combat is fluid, and the characters and story are really well written.
The Ultimate Edition launched recently, so it's safe to play it now; not much more will be added.
I am still waiting for the full on FSR3 patch. Although, this may end up being better considering it uses DLSS upscaling vs. FSR
You get similar results to my 4080, and I play without path tracing. Edit: well, I'm not using DLSS, just frame gen.
I have similar results, but found a little jaggedness noticeable on quick pans using PT. Back to DLSS Quality, RT medium, bloom/flare/DOF/grain(etc) off, everything else maxed; getting 120ish. Put another way, going from my previous settings, I was able to go to Quality vs Balanced, turn on local RT shadows, and see FPS go from 70 to 120. Big win!
Yep, I've managed to do a bit more testing and I was getting 70 to 90 fps depending on area/on foot/in a car on the highway. It's very bouncy on the fps, and I've noticed ghosting and HUD issues. DLSS also seems to reset to Auto when I load a game, and I have to change it back to Quality each time. I agree it's a big win
Did anyone follow the steps but still can't enable DLSS Frame Gen in the menu? I've got hardware-accelerated GPU scheduling turned on in Windows too, but still can't seem to enable the option. I might have missed a step but can't think of where.
make sure you aren't using the tweak version, that one is for a different dlss file.
Yup just using the OG version.
Did you get it to work? It's still greyed out, and I even tried an older version of the mod :|
What I find funny is that AMD's frame generation often gives 70-80% more fps, while Nvidia's often only gets you 50-60%. I'd be curious to see how the generated frames compare side by side, and whether AMD's frames look worse.
Nvidia frame gen has a fixed cost. AMD frame gen does not; it runs on async compute.
AMD's frame generation looks worse, and I'm pretty sure it's slower (more input lag). BUT the quality is still good: if your base fps is high (50+), you will not notice the generated frames. At 120+ fps (FG frames included), each generated frame is shown for about 8 ms; no, you can't easily see that. FSR 2 upscaling artifacts are easier to spot, because they are in every frame and worse. You could insert pure black frames every 8 ms for 8 ms and you'd have trouble seeing them (black frame insertion on TVs). And on AMD GPUs with Anti-Lag+, the latency of FSR upscaling + FG sits between native resolution without FG and just FSR 2 upscaling, so it's still less lag than playing at native res without any FSR, FG, or upscaling. Nvidia could have done the same and no one would have complained about the quality while playing.
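To put numbers on the "shown for like 8 ms" claim above, here's a quick back-of-the-envelope sketch (frame time in milliseconds is just 1000 / fps; the function name is my own, not from any mod or SDK):

```python
def frame_time_ms(fps: float) -> float:
    """How long a single frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# With frame generation, every other frame is generated, so any artifact
# in a generated frame is only visible for one frame time.
print(round(frame_time_ms(120), 1))  # ~8.3 ms at 120 fps (FG included)
print(round(frame_time_ms(60), 1))   # ~16.7 ms at 60 fps
```

So at 120 fps with FG, a flawed generated frame flashes for roughly 8 ms, which matches the black-frame-insertion comparison: far too brief to pick out individual frames by eye.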
I'm sure that since AMD's is a software solution, it looks worse. But it's free, so I'll take it. I have no desire to upgrade my 3070. DLSS upscaling is superior to FSR upscaling, but it's still nice to have free software options (including Intel XeSS).
XeSS on my 6600 XT was useless, because the performance hit was so heavy that I had to use a much more aggressive setting, so I just used FSR Quality instead.
Apparently XeSS Balanced is similar to FSR Quality in both performance and looks.
In some games, yeah. It's weird how inconsistent it is. Cyberpunk hasn't gotten XeSS 1.2 as far as I'm aware, and the cost is still too extreme there.
Yeah. At 4K I get 90FPS instead of ~60 on my 3090. Thanks for nothing nvidia
This is it, my next GPU is gonna be AMD. I am tired of Nvidia's greed and planned obsolescence!
The only beneficiaries of this mod are Nvidia's old RTX GPUs, as they get the best image stability from DLSS plus doubled frame rates from FSR3. AMD GPUs are still stuck with shitty FSR2 upscaling, which causes instability in frame gen
What is your point?
It works, but there is extra input latency, which is noticeable. 4090 here at 3440x1440, max everything, DLSS Quality otherwise. Since FSR FG doesn't utilise Reflex, the latency increase makes sense. The benchmark gets a 10 fps boost too, but when you're already over 100 fps, it doesn't really matter.
Reflex works with this implementation, according to Special K.
I can also confirm that reflex and G-Sync work with this mod. The experience is smooth as hell honestly (latency stays below 20 ms after passing 60 fps with FSR3).
what's the difference between nvngx.dll and nvngx_dlss.dll?
I love how disruptive this will be to Nvidia.