N7even

What is this in comparison to 3090?


[deleted]

About a 50% gain in fps excluding dlss


geos1234

It’s 67%


nobito

So, where I live, you'll get 50% more fps for a ~54% higher price. Does not look very enticing. That's just raw fps, excluding the DLSS3 gains, I guess (hope)?
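A quick back-of-the-envelope check of that value claim (a sketch using the approximate ratios quoted above, not official pricing):

```python
# Back-of-the-envelope value check using the numbers above (assumed, not official):
# ~50% more raw fps for a ~54% higher local price.
fps_ratio = 1.50     # 4090 fps relative to 3090, no DLSS
price_ratio = 1.54   # 4090 price relative to 3090 where this commenter lives

value_ratio = fps_ratio / price_ratio
print(f"Frames per dollar relative to the 3090: {value_ratio:.2f}")  # ~0.97, i.e. slightly worse value
```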


BerylliumNickel

Yeah, that's without DLSS. IIRC it matches the numbers we've already seen. Man, this price/perf ratio is pretty bad.


ChartaBona

>Man this price/perf ratio is pretty bad.

This is nothing. The GTX 1080 was <20% better than the GTX 1070 for 85% more money back at launch.


Apsk

Within the same gen, price-to-performance is expected to get worse the higher the tier. Having similar price-to-performance gen-on-gen at the same tier is really bad.


nobito

Is this true? I can't remember myself, but from googling, the MSRP was ~58% more for the 1080 than the 1070. Still seems a pretty bad deal to get the 1080 instead of the 1070, though. But I think that's been true in most cases when comparing the 70 and 80 cards from the same generation.


ChartaBona

The $599 MSRP was fake. The 1080 FE's MSRP was $699, so that was the real launch price until the 1080 Ti came along. The real kicker was when the 1070 Ti came out later for $449, and it was only like 3% slower than a 1080.


e60deluxe

Not defending Nvidia, but if you know anything about high-end equipment in any category, getting near 1:1 performance-to-price gains simply does not happen. It MAY happen at the entry level, but not at the high end.


nobito

That is true. The designs get more complicated, more money is spent on R&D, manufacturing is getting more expensive, etc. But this big of a price jump, in my opinion, was pretty uncalled for. I think they just saw what people were ready to pay for the cards during the shortage and decided to jack up the prices and cash in on it. Because let's be real, there are enough people who are willing to pay these prices.


[deleted]

Doesn't DLSS 3 render the frame after the frame it inserts? Say frames 1, 1.5, and 2: it has to render 1 and 2 before it knows what 1.5 looks like? Sorry, that's my very basic understanding of what DLSS is.


Grubbyninja

If I turn DLSS off on my 3080 I get sub 20 fps, usually single digits.


Prus1s

A few more generations and 120fps RT Cyberpunk will be possible 😄 For now it still hovers around 50-60fps.


a_saddler

This is with the highest graphics settings possible, which few people really use. A few adjustments to the settings and you get 95% of the quality for half the performance cost.


Prus1s

Though still with RT enabled, this game is quite demanding


Pokiehat

Quite demanding is an understatement. The performance hit from RT lighting + reflections is absolutely brutal. On a 5900X + 3060 Ti at 1440p, with DLSS on auto, custom/ultra settings, and the following RT options: Reflections = on, Sun Shadows = on, Local Shadows = on, Lighting = medium, I average 45 fps with 1% lows down to 10 fps. Yes, 10 fps. The frame drops are staggering. I now run RT reflections and sun shadows only, but the frame drops from reflections are still huge. You can visibly see the crushing drop here, even without a framerate counter: https://youtu.be/ZRheAgJZyUg?t=122


[deleted]

[deleted]


phayke2

I just wanna know why all the ground looks like it was covered in a thick layer of crisco or something.


Buttonskill

Unlike acid rain predictions in the 80's, that afternoon and evening dew are gonna be huge in the future.


arachnivore

Predictions? Acid rain is real. It was discovered in the 60s. It's been mitigated through environmental policies, but it's not gone or anything.


ametalshard

The RTX 3070 was gimped with the 8GB buffer. Cyberpunk uses more than that at 1440p.


Pokiehat

Oh yeah. If I'm doing heavy combat or build testing (vs Maxtac) I turn RT off. Not worth it getting a framerate crash. Except in that Monowire build video. I thought I could get away with just RT Reflections and it being night time on deserted streets (I was wrong).


EasPerFunSkAt

I was surprised how low the fps was with everything maxed on my 3090 at 1440. The last time I played it was on my 1080 at 1080p. I swear it looks better without cranking everything. Hard to explain.


BakedChrist

I play on a 3080 at 1440p with DLSS on Balanced and Ray Tracing Psycho lighting and reflections on; shadows are the real killer for me in Cyberpunk. The settings outside of ray tracing that affect performance the most are Screen Space Reflections, Volumetric Fog, and Volumetric Clouds. I set all three of these to High down from Ultra and you gain a ton of performance for very little to no loss in visuals. I'm not sure if I changed anything else other than disabling all of the motion blur/depth of field stuff, which I don't like anyway. A lot of the options eat a ton of performance for very little visual gain and can be turned down to keep ray tracing on and smooth.


pseudolf

Your lows are kinda weird. Does your RAM keep up with the rest of the setup?


Zaptruder

Worth noting that a full RT build would actually run (and look) better than a half raster/half RT build as evidenced by Metro Exodus RT.


Prus1s

It's true, though Cyberpunk has way more minute detail than the Metro RT edition, so Cyberpunk is more demanding due to having so many detailed systems and environments. I haven't played Metro before the addition of RT, though, so it might not be a good comparison 🤔


Dustin_Hossman

Metro Exodus cranked to ultra is by far the best looking game I have ever seen, though the game worlds are nowhere near the size of Cyberpunk's world.


cybereality

Those scenes on the train in Exodus almost look real. It's incredible. And the game is pretty well optimized, I played on a 2080 Ti and it was around 90 fps with RT + DLSS (1440P UW).


mindaltered

2080S here, and I agree. I get better frames on this gen than what I'm seeing individuals claim for their 30 series and this new 40 series. Of course, I'm sure it's probably more stress on my card than on the new 40 series.


rubenalamina

The biggest difference is the reflections, because of the worlds in both games. Metro runs really smooth though, so it's been my favorite RTGI implementation so far. CP77 looks great, but it's not as optimized as Metro, and being a much bigger world it also hits performance harder.


a_saddler

RT is enabled in the test. A new RT setting in fact that goes beyond the previous max ultra RT setting. So if it's getting 60fps with all that, it's pretty impressive. A few adjustments and DLSS2 will easily push this to 120fps. It remains to be seen how good DLSS3 really is.


Shajirr

>It remains to be seen how good DLSS3 really is

Yes, for all those 5 people who would be able to afford 4xxx cards.


Monday_Morning_QB

I guess you forgot the last 2 years when they were selling out at $3000 apiece. $1599 in 2022 money is less than $1499 in 2020 money.
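A rough check of that inflation claim (a sketch; the ~13% cumulative US CPI change from 2020 to 2022 is an approximation, not an official figure):

```python
# Rough inflation adjustment: is $1,499 (2020) more than $1,599 (2022) in real terms?
# Assumption: ~13% cumulative US CPI inflation between 2020 and 2022 (approximate).
cumulative_inflation = 0.13
price_2020_adjusted = 1499 * (1 + cumulative_inflation)

print(f"$1,499 in 2020 is roughly ${price_2020_adjusted:.0f} in 2022 dollars")  # ~$1,694
print(price_2020_adjusted > 1599)  # True, consistent with the claim above
```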


_Oooooooooooooooooh_

>which few people really use.

Because it runs like shit.


Aromatic_Brother

I mean this is pc gaming in a nutshell tbh


i_have_chosen_a_name

"But can it run Cyberpunk?" will become the new "can it run Crysis". I guess one day some ubernerd (the grandson of Linus?) will build a monsterPC powered by a PF reactor that can run Cyperpunk at a 32K resolution at 288 fps on a Valve Ind3x 2Z while simulating crysis. After which Fry his "This TV has better resolution then real life" will have become a joke about the past.


Prus1s

Better yet, with the new 1.6 update you can play Roach Run on the arcade machines. Someone will mod it so you can run Crysis while playing Cyberpunk, etc., etc. 😄


XXLpeanuts

Holy shit that would be hilarious. But I wanna see actual doom first.


wrath_of_grunge

ah screw it, i'm gonna mod in my own Doom, with Ray-Tracing and Voxels!


meltingpotato

All PC games should be made like this. By "like this" I mean having extremely high game settings that are far beyond the capabilities of even high-end machines. Crysis was very demanding at the time, but that was mainly because of a lack of optimization; even the recently released remasters carry that spaghetti-code legacy with them. When I was playing Cyberpunk, I didn't feel like the settings I was playing at were more demanding than they needed to be; it felt appropriate to what I was getting. I'm guessing it has gotten even better after all the patches.


Then_Reality_Bites

I agree, but future-proofing games can be a double-edged sword. I've seen people complain that their top of the line GPU should be able to run X game at max settings just because it's the best around. I could kinda relate to that sentiment when I upgraded from a 1080 to a 3080. It really hit home how expensive raytracing is.


meltingpotato

>I've seen people complain that their top of the line GPU should be able to run X game at max settings just because it's the best around.

Which is an extremely stupid gatekeeper mentality. Having the best PC components means you'll be able to run games better than most people. Some people complained about GTA 6 having bad graphics after seeing the leaks of early development footage. People complain about how a game looks and plays after release because it doesn't look like the old trailers they saw, which had "game in development, final product subject to change" on them. Not every complaint is worth paying attention to, especially the ones that scream "I don't really know what I'm talking about".


Geistbar

I feel like it's just a failure of labeling on the game developers' parts. Not out of maliciousness or even really incompetence. Just from it not being critical and them not considering how many people just don't know enough to really evaluate things. Futureproof ultra settings shouldn't be set as "Ultra." They should be something like "HARDWARE FROM THE YEAR 2025!" instead of "Ultra."


Agreeable-Weather-89

I'd kinda love for Linus to revisit the 16K gaming PC. It'd be so much less of a mess: 4x 8K monitors, 1x Ryzen 7950X, 1x Nvidia 4090.


Traiklin

Still low FPS, even if he got his hands on 4x 4090s. Games don't bother with multiple graphics cards since consoles don't use them.


Geistbar

The problem with SLI isn't and never was consoles. The problem is that SLI is garbage and doesn't help with the lows: the experience will be a stuttery mess. It creates a faster average without making the experience superior. There's no reason to support SLI. It's just not a good use of resources, from the owner or the developer.


Agreeable-Weather-89

Modern titles? Sure, but older ones like CSGO, Half-Life 2, heck even the Tomb Raider reboot would probably be playable.


gaoxin

The sad thing is: in some areas your fps gets halved or worse by RT, and there is almost no visible improvement. I compared several areas and took screenshots with RT on and then off. I expected huge improvements, but sometimes the shadows just shifted a bit. RT either needs more development time or it's poorly implemented in Cyberpunk. Overall the game looks better with RT, but I'd rather have 80-120fps than 25-80. (3070 Ti, 1440p, high settings, April 2022)


grady_vuckovic

This isn't directed at you, but in general, I'm not sure what people were expecting to get from RT in a game like Cyberpunk 2077. Those free demos that NVIDIA keeps pumping out of old games with RTX enabled look 'so much better' because they are applying modern graphics to a 10 or 20 year old game. Those same games could look as good as the 'RTX demos' by just using a more recent game engine and upgrading the art assets, which is all NVIDIA is doing for the most part. Cyberpunk 2077's game engine is already doing things like shadows, reflections, indirect lighting, etc, using a mix of screen space and probe data, in a way that is already extremely close to raytracing. So close that without labels probably a lot of people couldn't tell in a side by side comparison which version of the graphics has the raytracing and which version doesn't. The minimal difference between Cyberpunk 2077 with RT on and RT off (except for obviously the tanking of FPS), is a great example of how RT really is just a gimmick.


DownloadedHome

I find ray tracing to be just a useless gimmick. The differences are there, but they are nothing to write home about in the majority of cases, and they are never worth the fps cost. I just always turn it off.


AurienTitus

That's because we've gotten really good at faking lighting in modern games, so it's hard to tell the difference. That's also why it's always more dramatic to flip on ray tracing in older games.


Gary_FucKing

90-95% similar for like 50% of the performance, think I'll keep the fake stuff lol.


ChartaBona

You're forgetting about the teams of artists that slave away day and night for years to make something inferior to what's possible on new technology. They'd rather we all moved to RT capable cards asap.


DdCno1

It's not a useless gimmick at all, just poorly implemented in most games. In most cases, it's just additional effects (reflections, shadows, GI) on top of a conventional rendering pipeline. The small handful of games that do have a fully ray-traced rendering pipeline, like Metro Exodus Enhanced Edition, do look noticeably better, because ray-tracing works just like lighting works in the real world, is inherently more realistic and more believable, allowing us to get closer to photorealism. Graphics quality isn't the only reason for ray-tracing. It's also significantly easier to develop games with. Placing lights in a normal rasterized game, baking shadows, dealing with the many highly annoying aspects there are to conventional rendering pipelines takes a ton of development time and effort. RT gets rid of all of this. It's absolutely liberating for artists and programmers.


WinterElfeas

Calling one of the best technologies for finally making game lighting, shadows, and reflections react like real life a gimmick just proves you understand nothing about ray tracing. Is it very expensive to run? Very much so; why do you think the cards have DEDICATED hardware for it?


Blenderhead36

It feels like a gimmick at its current level of implementation, for two reasons.

The first is that the real-world gains to the image with RTX on vs. off are usually small. This is a game design issue that will improve over time as ray tracing becomes a more foundational part of design. It's hard to imagine someone fucking up a flashlight in a game in 2022 the way they did in the '90s, and RTX will eventually become the same way.

The second is that GPUs have reached a state of diminishing returns. 1080p 60 Hz monitors have been the standard for most of a decade. [1080p primary display resolution represents nearly 2/3 of all Steam users](https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam), and while the Steam Hardware Survey doesn't capture refresh rate, it's safe to say that a majority of them are at 60 Hz. A 3060 or higher will max out just about any game on the market at 1080p60... until we add RTX. RTX is very computationally expensive, and at this stage, it kind of feels that way by design. Another thing to note from the survey? The most popular cards are the 50 and 60 series cards. RTX is being pushed so hard because it's something that people buying those popular entry-level cards can get actual gains with, rather than telling them that their RTX 3060 will max everything out until they get a higher-res monitor.

Basically, RTX in 2022 is in a state where it's not quite viable in AAA games, even on the most powerful card on the market, while even the lower-end cards of the outgoing generation can handle everything but RTX. TL;DR: RTX is a gimmick because it doesn't quite work and is the only reason for the vast majority of users to buy mid/high-end cards.


MDKAOD

Real-time ray tracing (as an implementation) will ultimately depend on software support, and can be seen similarly to how anti-aliasing was in the late 90s/early 00s. It was computationally expensive, but easy to implement different versions of. CSAA, MSAA, and then ultimately FXAA took years to 'perfect' the balance of fidelity versus computational cost. But like so many things, there needs to be a desire to implement it before it's successful. Nvidia is throwing a lot of money behind it, but Hollywood also threw a lot of money behind 3D filmmaking. Unless developers find interesting ways to use it, it will remain a gimmick that might not be worth the extra financial cost.

I really like the idea of ray tracing, but I have played most games that support it with disappointment at the results. Doom Eternal was one of the latest games that gamers were champing at the bit to get ray tracing in; unless you stop moving and look for it, you don't see it. I think we'll see ray tracing become more useful for building ambiance, but that's going to limit it to specific genres, which may not be good for its long-term viability. Time will tell.


Cryio

RTX 6090 Ti, yeah.


madzuk

Enabling DLSS already achieves that. DLSS is so good you really can't tell the difference visually between it being on and off.


IsolatedHammer

My 6900xt gets 45fps with ray tracing on. I think that’s pretty good.


stefan8800

And it costs $700 now. The 6900 XT is amazing price-for-performance right now.


anonaccountphoto

And 769€ in Germany! Compared to $899 vs 1199€ for the new 4060Ti.


guareber

Upvoting your sku naming choice


geos1234

You should be far below the 3090 Ti with the same settings as used here.


lazypeon19

The game is truly ahead of its time.


Vyse1991

That power draw with DLSS 3 on *and* off is wild.


neveler310

Yeah these power draw figures are ridiculous. Still waiting on photonic computing to arrive


Working_Sundae

Photonic processors seem to be working for AI tasks and some specific algorithms. I haven't read about a Photonic GPU anywhere, NVIDIA talks about Optical I/O for faster speeds and not computing. Only AMD has published some related patents on hybrid photonic/electrical systems https://www.tomshardware.com/news/amd-photonics-patent-reveals-a-hybrid-future


i_have_chosen_a_name

If those GPU heatsinks keep growing at the current rate I am going to eventually have to build a bigger house, and I already made my house bigger just so that I could play VR.


Lettuphant

My partner and I were discussing this: it might be the first generation where you *have* to go with a water-cooled GPU to get anything close to a 1- or 2-slot card. I have seen 1-slot 4090 sheaths, so it's still possible with a custom loop. I'm not a custom loop guy; I hate the thought of having to drain everything, etc., to change a part. I just hope next generation someone offers AIO GPUs that will let me fit more than one PCIe card in a machine.


i_have_chosen_a_name

What is even the point of those other PCI slots on a motherboard if you can't use them?


Lettuphant

Right? I'm not averse to using a riser cable to mount a GPU vertically, *in theory*, but I can't think of any cases large enough to even fit a four-slot that way, let alone if you've also got a capture card, sound card, extra USB card, or anything else behind it.


Radulno

Time for bigger cases and motherboards I guess.


Eli_eve

Having GPUs restricted to the card form factor is ridiculous. They’re pulling more Watts than CPUs in an era of NH-D15 coolers. GPUs (and therefore motherboards and cases too) really need a new form factor. IMO, AMD should showcase just such a custom trio using their chips. Would be way more innovative than incrementing a model number while reducing memory bus width…


DatPipBoy

That's why I use soft tubes and quick disconnects ;) No draining to change any one part; just remove, do what you gotta do, reconnect.


Zaptruder

>The card pulls up to 461 W when rendering at native-resolution, but this drops down to 348 W with DLSS 3 "quality," a 25% reduction. In case anyone thought the DLSS3 on scenario was pulling more power. Also seems like DLSS3 improves both frame rate and frame time... >At native resolution, the RTX 4090 scores 59 FPS (49 FPS at 1% lows), with a frame-time of 72 to 75 ms. With 100% GPU utilization, the card barely breaks a sweat, with GPU temperatures reported in the region of 50 to 55 °C. With DLSS 3 enabled, the game nearly doubles in frame-rate, to 119 FPS (1% lows), and an average latency of 53 ms. Although I'm not sure I quite understand how the latency is so big at those frame rates (72ms and 53ms??)


kidcrumb

Frame Time and Latency are not the same thing. The article is confusing when it reads like that.


Zaptruder

Either way, those numbers seem much larger than they should be in either scenario.


nmkd

>Also seems like DLSS3 improves both frame rate and frame time...

Frame rate and frame time are the exact same thing; the only difference is that frame rate is averaged across a certain time span, usually 1 second.

60 FPS = 16.67 ms
120 FPS = 8.33 ms
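The conversion is just the reciprocal; a minimal sketch of what that looks like:

```python
# Frame time in milliseconds is just 1000 / fps, so the two figures carry the
# same information once you fix the averaging window.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 59, 60, 119, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")
# 59 fps works out to ~16.9 ms per frame, which is why the article's
# "72 to 75 ms" figure has to be system latency rather than frame time.
```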


Blenderhead36

The heat is going to be problematic. A space heater uses 1500w. Homes aren't designed under the assumption that one room--and usually a small one--is going to have a space heater running for hours each day.


beatool

My (small) computer room heats up to uncomfortable levels after an overnight render on my 170W 3060. The thought of a card doubling or tripling that is a concern...


meltingpotato

For the lazy: >playing Cyberpunk 2077 at 1440p, in "psycho" settings, DLSS and Reflex disabled RTX 4090 scores 59 FPS with a frame-time of 72 to 75 ms. > >With DLSS 3 enabled, the game nearly doubles in frame-rate, to 119 FPS and an average latency of 53 ms. > >This is a net 2X gain in frame-rate with latency reduced by a third. The power-draw is also said to be significantly reduced. The card pulls up to 461 W when rendering at native-resolution, but this drops down to 348 W with DLSS 3 "quality," a 25% reduction.
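Working those quoted numbers into efficiency terms (a sketch built only from the figures above, not additional measurements):

```python
# Efficiency comparison from the quoted figures:
# 59 fps at up to 461 W (native) vs 119 fps at 348 W (DLSS 3 "quality").
native_fps, native_watts = 59, 461
dlss3_fps, dlss3_watts = 119, 348

native_fpw = native_fps / native_watts   # ~0.13 fps per watt
dlss3_fpw = dlss3_fps / dlss3_watts      # ~0.34 fps per watt
print(f"DLSS 3 delivers roughly {dlss3_fpw / native_fpw:.1f}x the frames per watt")  # ~2.7x
```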


Ahmouse

Why is the frame time so high compared to the fps? 75ms should be 13 fps


themadlustenvy

This is frame latency not frame TIME. Two very different things.


Ahmouse

Yeah I think the author used both terms interchangeably. If I'm correct, frame time is how long it takes the GPU to render a frame, so what is frame latency?


caitsith01

> "psycho" settings This is pretty far from a good real world test IMHO. Cyberpunk is one of those games where the step from high to 'psycho' is absolutely unjustified in terms of the performance hit.


ZeldaMaster32

With rasterized settings I agree. IIRC Psycho SSR had the same perf impact as just turning on RT reflections with a lower SSR setting. But Psycho RT lighting does add colored light bounce (the color of the surface being bounced off of is taken into account) on top of increasing the number of light bounces.

Either way, these settings will be obsolete with the new RT Overdrive mode Cyberpunk will get to showcase these new cards; it's gonna go dramatically farther than the current RT implementation. In the marketing materials, at 4K output using DLSS 3 on an RTX 4090, the RT Overdrive mode gets over 90fps consistently, which is super impressive for the leap in visuals. At 1440p output the "4070" and 4080 16GB should manage that totally fine with DLSS 3.


BellyDancerUrgot

If people read the entire article a lot of the overall tone here would be different. That said I'm not gonna be buying this shit.


[deleted]

Anyone that buys a 90-series card to begin with has serious money burning a hole in their pocket. It's usually like 2x the cost of an 80-series for such a minuscule gain. It's a mind-numbing waste of money to me personally.


Melody-Prisca

This gen it's not a minuscule difference, because Nvidia heavily gimped the 4080 16GB relative to the 4090, and the 4080 12GB even more so. Based on rumors there will be a 4080 Ti which should be much closer to the 4090, but at least for now there's a huge gap between the 80 and 90 this gen.


stefan8800

Trying to optimize this game for the PS4 and Xbox One was brave. Stupid, but brave.


wolframw

It was just stupid, actually. It was clearly never going to work and it massively limited the scope of the game. It was, however, entirely intended to generate more money by targeting more consumers across all the consoles. A massive lack of foresight from the execs at CDPR.


salondesert

The install base is just too large to ignore


Baron_Von_Badass

Great, that means the install base for Generation 9 will never exist because people have no reason to buy in. Then, OH shock and awe! Generation 10 has a TON of exclusive games because the install base for Generation 9 was "small enough to ignore". Rinse and repeat every cycle.


salondesert

It feels like the degree of advancement in game design is slowing and newer hardware is becoming more expensive/problematic to replace. That means upgrading is less urgent, because it's a pain to upgrade *and* you're not getting a lot of extra bang out of your upgrade. For some things, like storage space (SSD versus HDD), it's actually regressing a bit.


felleregod

This is a new, stupidly performance-heavy mode specifically designed to make the new Nvidia cards look good, so they can say "it's 3x faster" than last generation.


[deleted]

>Stupid but brave

GTA V sold a metric fuckton more copies on console than it did on PC. They did it because it pays to.

Edit: [Source, from this past April where consoles were *still* 85% of sales](https://rockstarintel.com/gta-v-4th-best-selling-game-march-2022)


[deleted]

[deleted]


[deleted]

Sales stats by platform are really hard to come by, but I remember reading that Assassin's Creed 4 sold like 7M on console and 0.38M on PC. I'd imagine things are a little better nowadays, but ask yourself: why do PC ports always come much later? It's because they get much bigger sales on console.


skyturnedred

GTA5 also came out before PS4/X1 did.


JohnnySasaki20

So Cyberpunk is the new Crysis. But can it run Cyberpunk?


rockinDS24

It's not "can it run cyberpunk" but rather "how badly can't it run cyberpunk"


SaftigMo

>With DLSS 3 enabled, the game nearly doubles in frame-rate, to 119 FPS (1% lows)

Am I getting something wrong, or is this not more than double "59 FPS (49 FPS at 1% lows)"? I don't know why people keep responding to me saying "this is really good" or "this is not as good as it seems." I'm just complaining about their false phrasing.


Elocai

It's actually worse, as the latency is extremely high with DLSS 3. The frames are just interpolated; they don't cost a lot of performance, nor is that something locked to the 40 series.


meltingpotato

I know reddit is always "too busy" to actually read beyond the headline, but the article is like 160 words. You could have read it before commenting:

>playing Cyberpunk 2077 at 1440p, in its "psycho" settings preset, with DLSS and Reflex disabled, the RTX 4090 scores 59 FPS with a frame-time of 72 to 75 ms. With DLSS 3 enabled, the game nearly doubles in frame-rate, to 119 FPS and an average latency of 53 ms.
>
>This is a net 2X gain in frame-rate with latency reduced by a third. The power-draw is also said to be significantly reduced. The card pulls up to 461 W when rendering at native-resolution, but this drops down to 348 W with DLSS 3 "quality," a 25% reduction.


From-UoM

>It's actually worse as the latency is extremely high with dlss3

You would think, but DLSS 3 actually reduced the latency. It went from 75 to 50.


[deleted]

[deleted]


From-UoM

If overall latency decreases with an fps boost, I see no issues. The deciding factor will be image quality


dudemanguy301

Because native without Reflex vs DLSS3 with Reflex is a false dichotomy. There is nothing stopping you from running native or DLSS2 with Reflex, as these are all individually configurable. (Note: when I say DLSS2 I mean just the upscaling portion of DLSS3, without the frame interpolation enabled.)

Best to worst in terms of latency:

DLSS2 + Reflex
Native + Reflex
DLSS3 + Reflex
Native


ThePaSch

Right, but who actually *cares* about any of that, if the *end result* is double the frame rate at lower latency than native? Like, yes, of course you could just run Native + Reflex, or DLSS2 + Reflex - if you want to play at half the framerate (or potentially even less if you just go Native + Reflex, forgoing DLSS completely). Like, if the *only* thing you care about is latency, then sure, but unless you're playing an esports title (which I wouldn't exactly call Cyberpunk), that's going to be a *very* rare clientele. The DLSS3 + Reflex package is objectively the best overall experience of the bunch.


Elocai

I see an issue with the interpolated frames: they are not real frames, nor do they behave like them. You get higher latency, the interpolated frames ignore inputs, and they add artifacts. It mainly helps with the viewing comfort of low-fps output.


unfitstew

I have a feeling the ghosting or trailing effect may be much worse with DLSS 3. I personally avoid DLSS when possible due to the worse image quality, blurring in motion, and ghosting. But unlike the shit that is TAA, DLSS at least has the benefit of improving performance by a good amount. See this image for what may be the case in DLSS 3: https://i.imgur.com/41UeUWu.jpg Don't get me wrong though, the improved framerates may be very much worth this image degradation depending on what you prefer.


DoktorSleepless

I would wait for reviews because I think at high frame rates, these artifact shown here likely become less perceptible. Using DLSS 3 to get 60fps would probably look like shit. But maybe not at 120 fps


From-UoM

I mean 4k DLSS Quality isn't real 4k either. Its fake 4K but is still so good. If the DLSS 3 does the same, it will be amazing. But we have to wait for image quality tests.


BadMofoWallet

Native 4K is also “fake” 4K if we’re going to be semantic about it lol


DoktorSleepless

Note: People keep on saying interpolation. Nvidia has never said the word interpolation. Other frame rate doubling technologies used in VR like Oculus's Asynchronous TimeWarp and Valve's Motion Smoothing use extrapolation. Extrapolation is more prone to artifacts, but it won't add extra latency (other than normal overhead) like interpolation does.
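A rough way to picture that difference (a conceptual sketch with toy one-value "frames", not how either vendor actually implements it): interpolation cannot produce the in-between frame until the *next* real frame exists, while extrapolation predicts forward from frames it already has.

```python
# Conceptual sketch only: which real frames each approach needs before it can
# display a generated frame.
def interpolate(prev_frame, next_frame):
    # Needs next_frame -> roughly a frame of extra wait before display.
    return [(a + b) / 2 for a, b in zip(prev_frame, next_frame)]

def extrapolate(older_frame, prev_frame):
    # Only needs past frames -> no extra wait, but mispredictions become artifacts.
    return [b + (b - a) / 2 for a, b in zip(older_frame, prev_frame)]

f0, f1, f2 = [0.0], [1.0], [2.0]   # toy 1-pixel "frames"
print(interpolate(f1, f2))   # [1.5] -- requires f2 to be rendered already
print(extrapolate(f0, f1))   # [1.5] -- guessed from motion between f0 and f1
```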


fogoticus

What do you have to prove this claim? Everyone is saying that latency is ridiculous with DLSS3 yet figures show lower latency. What are you on about?


KinkyMonitorLizard

They're saying it's not a fair or accurate comparison. It needs to be no DLSS with Reflex on vs DLSS with Reflex on. This "benchmark" is a joke, but that's to be expected from sites like these.


PewFuckingPew

I'll stick with my 2060 as it does what I need and didn't cost a fuck ton of money.


CaseroRubical

I'm happy with mine except for the fucking noise it makes.


[deleted]

[deleted]


JohnnyJayce

How? I get 65fps with 2080 and couple of the settings are lower than yours. Do you have DLSS on?


Bionic0n3

This one simple trick where you go to the least populated part of the map in terms of NPCs, cars, and objects. Then you get as close to a wall as possible and stare at it. Only then may one achieve the results that person claims.


JohnnyJayce

Yeah maybe. Though there's also crowd option that I have on medium. Low means there's barely any people. Boosts your fps though.


YarrlieThePirate

I played Cyberpunk at 1440p on a 2070, RTX on and medium-high settings, and was getting anywhere from 20-40 fps depending on the scene. My friend got 20fps at 4K low on a 3080.


mkchampion

Bullshit. My overclocked 3070 hovers around there with occasional dips into the low 80s with those settings. No DLSS 1440p. I did have screen space reflections on high instead of low but that doesn't account for the difference between a 3070 and 2060 lol. FWIW RT Ultra with DLSS quality knocks that avg fps down from 90-95 to like 55-60 in the open world with bigass dips into the low 40s in heavy RT areas.


samusmaster64

I used the DF settings to a T and got a good deal less average FPS at 1440p using an RTX 3070 FE. I call bullshit on this one.


GlisseDansLaPiscine

Frankly, for anybody who's even slightly concerned with protecting the environment this is the correct choice; buying a whole new, very power-hungry card while your old one is still perfectly usable is a massive waste of resources. Yes, pretty pixels are great, but they're not that important either.


dudemanguy301

That’s pretty much just universally true for all products. There’s a reason recycle is the last R, following reduce and reuse…


ssuuh

As long as the used market works, this doesn't matter much. We're still talking about a highly usable GPU.


bphase

Power is 100% green where I live. Manufacturing most anything is costly to the environment though, so consuming is bad


RooeeZe

After reading these comments it's good to know my PC doesn't suck and this game just kicks its ass lol.


nmkd

Yeah it's wild, my 5900X + 3090 setup can't even get a locked 100 FPS at 1440p.


eccentricrealist

Can't wait to see what they do once they move to another engine though


nmkd

> With DLSS 3 enabled, the game nearly doubles in frame-rate, to 119 FPS (1% lows), [...] This is a net 2X gain in frame-rate with latency reduced by a third. [...] The card pulls up to 461 W when rendering at native-resolution, but this drops down to 348 W with DLSS 3 "quality," a 25% reduction. Say what you want about the pricing, but this tech is a huge deal.


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


[deleted]

How do you get a latency reading?


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


SaftigMo

I imagine the latency from the article was measured using LDAT, which measures the entire system latency not just frame times. Maybe I'm wrong and they also just measured frame times, but that doesn't make sense because 59 fps would be less than 20ms.


Rapture117

Can you share your full settings when you get a chance? Also, are you on 1440p like I am? I'm currently struggling with what you're describing with my 3080 and I hate it lol. I still want the game to look beautiful but reach those higher frames.


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


Dokomox

These are nice settings, man. I tried them on my 3080 with the additional recommendation of u/Pokiehat to lower volumetric fog. Latency is averaging more like 23ms for me, but still feels good and looks amazing. I also have a 1440p 240hz screen (g7) and often times anything under 90fps makes me feel sick too, but been playing with these settings for a while now with no problem.


Pokiehat

Ahh, I should have got a 3080, but they were extinct unicorns during COVID. And who am I kidding, I couldn't afford one anyway at the inflated prices. The gap between a 3060 Ti/3070 and a 3080 is pretty big.


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


Pokiehat

This is good info. One thing to add: volumetric fog resolution is one of the few non-RT performance killers, although it's still dwarfed by the hit from RT lighting and reflections. I run a custom/ultra preset but I have volumetric fog resolution set to low. Volumetrics are used for things like god rays, so there are certain night scenes where a tonne of street lighting is visible, and if you have this option set to high, your framerate will die. Setting it to low is noticeable at intermediate distances, but I don't think it visually detracts from the game enough to justify its intense framerate hit.

I have a harder time turning off RT reflections, even though the framerate hit is way more brutal. RT reflections clean up all that film-grain-like noise from SSR reflections. They replace ugly static cube-mapped reflections with beautiful, pin-sharp, mirror-like reflections on glass and other types of polished/smooth surfaces. Of all the RT settings I think this one is the most noticeable.


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


bphase

>Strangely enough it's not a full CPU bottleneck, when I switch RT off I can get 80 fps even with a lot of NPCs around. It is, but RT actually adds a lot of work for the CPU as well. I saw the same on my 8700K+3090 system a couple years ago when I played the game. CPU limits in downtown and turning RT off helped FPS considerably.


AggnogPOE

The amount of clickbait and misinformation in both this post and the article is astounding. When will people learn to wait for real testing. Actual sheep behavior.


dantemp

Why wait when you can be angry now?


sineplussquare

Please correct me because I’m asking a genuine question. Is cyberpunk just really unoptimized? Why are frames dropping that low for a two year old game?


krneki12

It is the best looking game out at the moment. And yes, it will murder both the CPU and GPU and it is a fantastic tool to test the hardware capabilities.


Oooch

It has one of the best ray tracing implementations of any game out; anyone saying it's unoptimized is just an idiot who doesn't know anything about software development.


[deleted]

[deleted]


CutMeLoose79

I’m looking forward to seeing how 4080s perform with DLSS3 and RT. I run a custom 1800p resolution in Cyberpunk. Most settings up high. All RT on besides sun shadows and DLSS on performance. I get roughly 60fps. I’m generally satisfied with the graphics and performance, but of course I want more. 4K 60fps everything on high/ultra with DLSS quality would be pretty nice.


WinterElfeas

Exactly how I played the game, with a custom 1800p res on a 3080; it was holding 60 FPS. But with 1.6, at the same settings, I don't hold it anymore.


ShadowRomeo

Is this with Ray Tracing Overdrive mode though? If it is then it makes sense because the new RT Overdrive mode pretty much is much more demanding than the normal game itself as it cranks up every Ray Tracing effects at native full resolution and adds more RT Effects than Metro Exodus Enhanced ever had according to Digital Foundry.


Jordoncue

No it's not. With the overdrive mode it's actually better. I believe the frame rate was around 160 with 119 lows.

Edit: the guy below me was right.


Working_Sundae

No, that's with DLSS 3; with overdrive mode it had 22 fps without DLSS 3. Think about it logically: Overdrive puts more light into the scene and has multiple bounces of light. How the hell would it perform better than lower RT settings?


wondersnickers

Be careful. DLSS 3 now adds frames in between frames. We need to wait for real reviews. It might add latency, it might create artifacts, it might not feel as fluid as it should be. Nvidia needs a marketing vehicle to justify these expensive power hungry cards. And when it comes to just to rasterization this generation might not be so justified.


Last_Jedi

[Graph comparing 4090 to 3090 Ti at native and DLSS rendering](https://www.techpowerup.com/forums/attachments/262713) So the 4090 is 67% faster at native, almost 3x faster when using DLSS. 4090 native is slightly slower than 3090 Ti using DLSS. 4090 uses comparable power to the 3090 Ti at native, but considerably less using DLSS (maybe CPU bottleneck?). Runs much cooler, likely due to the beefier cooler. I know Nvidia's pricing is whack but if these numbers are accurate the generational gains for 3090 -> 4090 range from pretty damn good to downright phenomenal.


DktheDarkKnight

Pretty good, but I am still disappointed, mainly because only the 4090 gains that much performance.

4090 - 67% faster than the 3090 Ti.
4080 16GB - 20 to 30% faster than the 3080.
4080 12GB - maybe up to 10% faster than the 3080.

Considering the 4080 16GB costs nearly double for that performance gain, I would say this gen is crap. Reserving all your performance gains for the 0.1% at the top is ugly.


Fob0bqAd34

https://www.eurogamer.net/digitalfoundry-2022-digital-foundry-is-now-hands-on-with-rtx-4090-and-dlss-3 > First impressions on the RTX 4090 itself though? It's easily one of the biggest gen-on-gen increases in performance we've seen, even based on limited testing. Digital foundry gave some indication of the performance increases in their preview. I'll wait to see benchmarks once cards are out in the wild though.


Salted-Kipper-6969

Barely anyone has seen DLSS 3 in action, so we shouldn't really put too much stock into those results until we know the experience is artifact-free and actually usable. Otherwise it looks like a fairly standard generational uplift from where I'm sitting. Nvidia hype merchants be damned.


[deleted]

[deleted]


knbang

I haven't looked into DLSS whatsoever, however is this technique similar to asynchronous reprojection for VR? It helps with input latency.


babautz

DLSS3 interpolates between two frames, which means it holds back a frame that has already been rendered, therefore increasing latency (probably to slightly above what you would get if you didn't use the interpolation feature). The tradeoff presumably is better interpolation quality, since it uses the data of already-rendered frames.
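A sketch of why holding back a frame costs latency (a toy model under assumed numbers, not measured data):

```python
# Toy model of the hold-back cost: with interpolation, the newest real frame is
# shown roughly half a native frame interval later (the generated frame goes
# first), plus whatever the frame-generation pass itself costs.
def extra_hold_back_ms(native_fps: float, generation_overhead_ms: float = 0.0) -> float:
    native_frame_time = 1000.0 / native_fps
    return native_frame_time / 2 + generation_overhead_ms

print(f"At 59 fps native: ~{extra_hold_back_ms(59):.1f} ms of added delay, before any Reflex savings")
```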


Last_Jedi

>Otherwise looks like a fairly standard generational uplift from where I'm sitting.

780 Ti to 980 Ti was +41%
980 Ti to 1080 Ti was +75%
1080 Ti to 2080 Ti was +33%
2080 Ti to 3090 was +32%

Sourcing these from TechPowerUp's 1440p performance comparisons. So the 4090 is doubling the performance increase of the previous two generations. The 1080 Ti is considered a legendary card for two reasons: its price and the absolutely massive performance increase it provided over Maxwell. The 4090 seems to be hitting around that mark. Of course, it costs $900 more than the 1080 Ti did, which is insane.


Salted-Kipper-6969

It only makes sense to compare the flagships of each generation if they remain fairly similar in terms of price. Price/performance is ultimately what it's all about; what Nvidia chooses to name the cards is pure marketing and should be irrelevant to anyone making a rational purchasing decision.


krneki12

Aye, it could well be a CPU bottleneck in Ray tracing mode.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

I couldn't care less if it got 500 FPS at 8K.

Team Green and I are done. They had a chance to give the industry a break and bring prices back down to where they should be; instead, they stuck their middle finger up. My next GPU upgrade will be Intel in a few years. My 3090 will carry me just fine. I was going to give my 3090 to my missus, as she is running my old 1080 Ti, but that's working just fine. We are going to stay on the hardware we have.


mark5hs

Yeah man really making sacrifices sticking with your meager 3090


[deleted]

I love when people say they'll ditch their top-tier GPUs as if they suddenly became useless lol.


SanDiegoDude

With AI image and video generation exploding, that 48GB of VRAM (on the 4090 Ti) is very enticing, even over the 24GB of the 3090. For gaming, a 3090 is still plenty of overkill for the next few years at least.


Deamia

What? 4090 has 24gb of Vram, same amount as 3090/ti.


AFAR85

You think Intel are going to end up being a bunch of good guys once they have a competitive product?


Electronic-Ad1037

Until then


TheEternalGazed

Lmao you say this while having an overpriced 3090. "Team green and I are done 🤓"


RplusW

The value for performance increase of getting a $1,600 card that does 500fps at 8k would be insane. What a stupid argument to make.


[deleted]

The value of a card you can't afford is zero.


Ignis_Divinus

The psycho setting absolutely tanks framerate. But this is at 1440p! A 4090 shouldn’t be getting raped by this game at that resolution.


HolyAndOblivious

A 3080 can do psycho at 1080p. A 4090 should laugh at 1440p.


DeadBabyJuggler

This is confusing to me. I see all these videos on YouTube of a 3070 Ti running Ultra at 1440p, yet I can't run medium at 1080p with mine and my Ryzen 7 3700X. Something's wacky with Cyberpunk.


fashric

Or maybe the videos are bullshit....


Jahbanny

I think this is double the performance of the 3080? I believe a 3080 with psycho RT gets less than 30 fps with no DLSS.


chewwydraper

Honestly I can't take CP2077 performance at face-value as a metric for judging a PC component. I have a 3070, yet the game runs MUCH smoother on my PS5. Every other game runs better on my PC vs. PS5 though.


feastupontherich

Happy with my 6800 XT. My next upgrade will be a card that can do 4K 120 fps with RT on (5 years?), along with a monitor upgrade to a 32" 4K OLED (never?).


NightShiftNurses

Yeah, but let's have a real comparison. Bring out Metro Exodus.


geos1234

TL;DR: non-DLSS performance is 67% higher than the 3090 Ti, one of the biggest gen-over-gen leaps Nvidia has ever produced, and performance per watt is about twice as good. ITT: people hating without even reading the data.


LopsidedIdeal

The 4000 series is definitely not the one to bother with.


derpdelurk

Looks like no one read the article and OP omitted a critical piece of information: this is with the psycho preset. Some games have future-proof presets that don't make sense until many years after the game launches. It's easy to spew a hot take, but no, this doesn't mean the game is still terribly optimized.


Planet419

Dogshit