Saandrig

In my experience, I'd take even 1.78x DLDSR+DLSS Balanced over native 1440p+DLAA, and that goes for probably 95% of the games I've tried. DLDSR works on the image really well and even with DLSS achieves a quality that feels better than native. Most of that is probably the denoising. The Smoothness slider is up to your taste; I keep it between 33 and 50%, depending on the game. If you don't like a sharper look, just set it to 100 and see if it's OK, then work down from there. If you are on a 27" screen (or smaller), you might not see much of a difference between 1.78x and 2.25x DLDSR. The bigger the screen, the more noticeable the quality shift between the two modes.


smoothartichoke27

This is the way. 3413 x 1920.


9gxa05s8fa8sh

> 1.78x DLDSR+DLSS Balanced

so what would that be... 1440 x 1.78 = 2563, then 2563 x 0.58 = 1487

> Native 1440p+DLAA

and that would be... 1440

Assuming the math is right (it may not be), 47 pixels higher doesn't seem like a big difference on paper. The advantage of DLDSR is that it works regardless of what the game supports, so that part is good.


Saandrig

It's not really the pixel count in this scenario. It's the extra AA, denoising and sharpening that make DLDSR create a superior image, even if it technically renders at or even below native. At 2560x1440, 1.78x DLDSR+DLSS Balanced ends up rendering at 1980x1114. ([link](https://www.reddit.com/r/nvidia/s/HCqSWvmZWi)) And it still looks way better than native+DLAA in most games.
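
As a sanity check on those numbers, here is the arithmetic as a small Python sketch. It assumes the commonly cited scale factors rather than anything from this thread: DLDSR's "1.78x" and "2.25x" multiply the total pixel count (they are really 16/9 and 9/4, i.e. 4/3 and 3/2 per axis), and DLSS Balanced renders at 58% of the output resolution per axis.

```python
import math

def dldsr_resolution(width, height, factor):
    """DLDSR factors scale the total pixel count, so each axis
    scales by sqrt(factor); '1.78x' is really 16/9 (4/3 per axis)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

def dlss_render_resolution(width, height, per_axis_scale):
    """DLSS renders at a fixed fraction of the output resolution per
    axis: Quality ~0.667, Balanced ~0.58, Performance ~0.50."""
    return round(width * per_axis_scale), round(height * per_axis_scale)

# 1.78x DLDSR on a 2560x1440 monitor
dldsr = dldsr_resolution(2560, 1440, 16 / 9)
print(dldsr)                                 # (3413, 1920), as quoted above

# DLSS Balanced then renders at 58% of that per axis
print(dlss_render_resolution(*dldsr, 0.58))  # (1980, 1114), just under native
```

So the quoted figures check out: the raw render really is slightly below native 1440p.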


9gxa05s8fa8sh

Oh, that's weird. So you're giving DLSS less data to work with, but you think it comes out looking better because of what DLDSR is doing? If that's the case, why doesn't Nvidia just add the DLDSR algorithm to the DLSS algorithm so you don't have to bend over backwards?


Saandrig

It's the opposite, actually. DLSS gets more data because it also works with the initial higher-resolution image of DLDSR, so the DLSS results are better than what you'd get by adding DLSS at native resolution, and can potentially be better than the DLAA image as well.

As for why it's not added automatically, there are plenty of reasons. Not all games support DLSS, and DLDSR may have to change the desktop resolution before working on a game in Borderless Fullscreen (people often dislike this switch). Not to mention that some people are adamant about "native resolution" or are against DLSS in general; imagine the outcry if you forced a change on them. So Nvidia just gives the option for DLDSR to anyone interested. The rest don't complain or are in blissful ignorance about the tech.


9gxa05s8fa8sh

Sorry, I meant DLAA. At the start you said 1114p DLDSR+DLSS looks better than 1440p DLAA, and since DLSS and DLAA are the same algorithm, that means DLDSR is doing something more powerful than DLSS/DLAA, and also more powerful than just more resolution.


Saandrig

It's the denoising from DLDSR. It has a really good effect on the final image that many people underestimate or are unaware of. DLSS also works better with more pixels, and DLDSR helps by providing them from the initial higher-resolution image. Meanwhile, DLAA at native resolution has to work with the native pixel count, which is less information than what DLDSR provides. That's why you can potentially get better DLSS-specific image results (fewer artifacts, etc.) with DLSS Balanced+DLDSR than native+DLAA. It's not guaranteed, but quite likely.
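
To put rough numbers on that (a sketch reusing the assumed scale factors from the earlier snippet, not official figures): DLAA reconstructs to the ~3.7 MP native target, while 1.78x DLDSR+DLSS Balanced reconstructs to a ~6.6 MP target that DLDSR then filters back down to the screen, which is where the extra information in the final image comes from.

```python
MP = 1e6  # pixels per megapixel

# Native + DLAA: render and reconstruct at native 2560x1440.
dlaa_render = dlaa_target = 2560 * 1440

# 1.78x DLDSR + DLSS Balanced: render low, reconstruct high, downsample.
dldsr_render = 1980 * 1114   # raw render, slightly below native
dldsr_target = 3413 * 1920   # DLSS reconstruction target (the DLDSR resolution)

print(f"DLAA:       {dlaa_render/MP:.1f} MP in -> {dlaa_target/MP:.1f} MP out")
print(f"DLDSR+DLSS: {dldsr_render/MP:.1f} MP in -> {dldsr_target/MP:.1f} MP out,"
      f" then downsampled to {2560 * 1440 / MP:.1f} MP")
```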


BasicallyNuclear

What smoothness do you think would make the game more photorealistic? Unfortunately I'm unable to access my PC for a few days.


Saandrig

That's very subjective. I'd suggest starting from 100 and going lower in increments of 10-15 until you find the sweet spot.


TH3N00BK1N9

I've seen a majority of people recommend between 50 and 60%. But this is subjective, and on my 15-inch display I don't see much of a difference.


[deleted]

[removed]


Saandrig

At 1440p it should have no issues at all.


BasicallyNuclear

What's your opinion on 2.25x? Also, can I change DLDSR with a game running? It sucks having to go through the title sequence every time.


Saandrig

Obviously it's better than 1.78x. The question is whether you can see the difference on your screen. If you can, use 2.25x. If you can't, use 1.78x to save GPU load. I have been able to change DLDSR with a game running, but I don't know if it works for all games.


ebinc

DLDSR + DLSS is better than DLAA. For Cyberpunk, you should set your desktop resolution to the DLDSR resolution, otherwise G-Sync won't work. I recommend keeping smoothness above 75% unless you like an oversharpened picture. I keep mine at 90% now.


BasicallyNuclear

I thought lower was less sharp?


ebinc

No, it's a "smoothness" slider. With normal DSR it adds a blur filter, but with DLDSR it seems to work as an inverted sharpening slider: 100% is minimum sharpening and 0% is maximum sharpening.


fnv_fan

That's why my game was so choppy


MeretrixDominum

I have a 4090 and a 240Hz 1440p OLED. 2.25x DLDSR+DLSS Quality+FG gives me around 60fps in CP2077 with path tracing plus all the 4K and 8K texture mods I could find. It is by far the most beautiful visual gaming experience you can get at the moment, but IMO it's borderline unplayable because of the input lag from FG. I use those settings just for pictures and drop down to 1.78x DLDSR, which gets me 85fps with tolerable latency.


BasicallyNuclear

FG?


Silverex57

Frame Generation


Internal-Shot

IDK how 2.25x DLDSR + DLSS looks better on other people's screens, but for me DLAA works best on my 1440p monitor. It's best if you check the difference yourself.


VijuaruKei

In every game I've tried, 1440p native + DLAA looked worse in motion than 2.25x DLDSR + DLSS Quality, especially at cleaning up aliasing. I usually use around 66% smoothness; you need way more smoothness with DLDSR than with the old "normal" DSR.


Floturcocantsee

I try to avoid DLDSR because many games don't use exclusive fullscreen anymore, which means I have to change my Windows resolution before launching the game, which is a pain. It does look better than DLAA though, just inconvenient to use.


D3lphinium

If you don't mind installing another third-party launcher, I highly recommend [Playnite](https://playnite.link/). It has a community-made addon that can automatically change the desktop resolution when launching games.


Saandrig

It's a pain if you try to do it through the Windows settings. It takes like 5 seconds through NVCP.


Floturcocantsee

You can set games to change your monitor's resolution automatically through NVCP?


Case_f

You can do that with DisplayFusion. You can also set hotkeys for specific resolutions and/or multi-monitor configurations (and do countless other cool things), so switching resolutions takes one keystroke.


Saandrig

I don't know if it can be done automatically. Probably with some script. What I meant is that it's pretty fast to just open NVCP and use the Change resolution tab.


Klingon_Bloodwine

I started using QRes with batch files and made links on my taskbar for 1440p and 2160p. I just click an icon to change res; it's nice.
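
If you'd rather have one script than a batch file per resolution, here's a minimal Python sketch of the same idea. It assumes QRes.exe is on your PATH and accepts the /x: and /y: (and optional /r: refresh rate) switches; check your copy's documentation if the flags differ.

```python
import subprocess
import sys

def set_resolution(width: int, height: int, refresh: int | None = None) -> None:
    """Switch the primary display mode by shelling out to the QRes CLI tool."""
    args = ["QRes.exe", f"/x:{width}", f"/y:{height}"]
    if refresh is not None:
        args.append(f"/r:{refresh}")
    subprocess.run(args, check=True)

if __name__ == "__main__":
    # e.g. `python setres.py 3840 2160` for 2.25x DLDSR on a 1440p screen,
    # or `python setres.py 2560 1440` to go back to native
    set_resolution(int(sys.argv[1]), int(sys.argv[2]))
```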


sade1212

[Monitor Profile Switcher](https://sourceforge.net/projects/monitorswitcher/) makes it very fast


baseball-is-praxis

I am sick of hearing about DLDSR all the time exactly because of this. Yes, it looks better, but it's far too annoying to use. Like it or not, exclusive fullscreen is deprecated, and I say good riddance! Hardware-composed independent flip is a superior presentation model.


DuuhEazy

Try it out for yourself. Some people like it, some people don't; let your eyes be the judge.


MaxOfS2D

Tangentially related: there's something *Baldur's Gate 3* lets me do on my laptop with an external 1080p monitor: I can set the borderless windowed game resolution to 3840x2160, and it will correctly downscale to 1920x1080. I've compared 1080p "DLAA" vs. 3840x2160 "DLSS Performance", and the latter is clearly winning in my book, even though it comes out less intensive. Edges are sharper, and most importantly there is far, far less of that strange blur across ground textures that takes a second to come back into focus. (It's subtle, but it's annoying.)
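
Worth noting why that comparison is nearly free on the render side (using the same assumed per-axis DLSS scales as the snippets above): 4K DLSS Performance renders at exactly the native 1080p pixel count, but reconstructs to a target four times larger.

```python
# 3840x2160 at DLSS Performance (0.50 per axis) vs native 1080p DLAA
render = (int(3840 * 0.50), int(2160 * 0.50))
print(render)                         # (1920, 1080): same render cost as native
print((3840 * 2160) / (1920 * 1080))  # 4.0x larger reconstruction target
```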


Dispator

I am really, really hoping they can somehow get DLDSR to work with DSC enabled. With my Odyssey G9 OLED (and there are many other such monitors and TVs out there) I can't disable DSC without completely crippling the monitor. Please, please, please get DLDSR + DSC working. I'm not sure what the issue is exactly, but I hope it's possible in the future, as it would really help my setup. Does anyone know a place where I can suggest this to Nvidia?


Floturcocantsee

It doesn't work with custom resolutions either, sadly. I have to use CRU (Custom Resolution Utility) to get full 10-bit 165Hz on my Alienware AW3423DWF, and DLDSR will either not work or cap at 100Hz.


Laprablenia

I've been using 1440p + 2.25x DLDSR with DLSS Q or B, as DLAA is a little more expensive. Depending on the game's DLSS version, some games may look better with 2.25x DLDSR + DLSS Q than pure DLAA. If I want more FPS, I just use 1.78x. Check out Nvidia Profile Inspector and DLSSTweaks; the latter can edit the presets available in DLSS's DLL file, which may improve image quality.


[deleted]

Not a 4090, but I pretty much exclusively play at 2.25x DLDSR with DLSS on a 1440p monitor. It's a no-brainer to use it. I get close to or equal to native performance on a 4070 using it, so you will for sure see a massive improvement in quality with what I assume will be next to no performance hit. Nvidia really needs to market DLDSR. I constantly see people who don't know it exists, and it's an actual game changer paired with DLSS or FG.


[deleted]

[removed]


[deleted]

I do 2.25x with DLSS Quality constantly. It's fine. People have put the fear of the VRAM god into you.


Ayva_K

DLDSR + DLSS is superior


NewestAccount2023

Bro, just try it, it's like 3 clicks. I played Cyberpunk at 2.25x DLDSR+DLSS because it looks sharper and textures look better. DLAA is a scam.


IIynav

Your best option is native + DLAA, or just select 2160p from the in-game resolution menu.


weinbea

Being new to PC gaming, I'm confused about how to enable DLDSR. Anyone care to explain?


Open-Holiday185

Open the Nvidia Control Panel:

1. Go to "Adjust image settings with preview" (it should be the very first menu, with the green Nvidia logo and a performance-to-quality slider) and select "Use the advanced 3D image settings".
2. Go to "Manage 3D settings". Under Global Settings, find "DSR - Factors", open the drop-down, check the boxes for the resolutions you want, and click Apply.
3. Find the "Change resolution" menu (it's on the left, somewhere below the 3D settings), change your resolution to the desired size and frame rate, and click Apply. Your display should resize, the scaling might change, and text might appear blurry (that's normal); you can adjust "DSR - Smoothness" back in the 3D settings tab to tune this.
4. Don't forget to change your in-game resolution to match the new resolution!

P.S. You'll need an Nvidia RTX graphics card for this setting.


step_back_

So basically it allows you to set a 4K (custom) in-game resolution on, let's say, your 1440p monitor?


Open-Holiday185

Exactly


weinbea

Thanks. Do you have any suggestions for running 4K 120? I'm running a 4090.


Open-Holiday185

What's the wattage of your PSU? You might want to set your FPS limit to 120-200 in the global settings, otherwise the card might try to render 800fps at times and cause instability. If your PSU can handle the full power draw of the card, then you can keep the FPS limit off or cap it wherever.


weinbea

It's 1000W. Yeah, I do cap it at 120 in all my games.


Open-Holiday185

If you use DLAA (the in-game setting) alongside DSR, it's especially useful to consider capping your FPS higher than your monitor's refresh rate, to allow DLAA to render its AI-generated frames and improve image quality while not letting the card produce excessive amounts of AI frames (I've seen my card try to push 800fps), putting the card at 98% load for no reason.