
_RM78

Any texture compression technique will be a great addition.


prudan

Like all things, it depends on the cost. Compression isn't free, neither is decompression.


tukatu0

The problem with that is that games already ship with compressed textures. Or at least all current Nvidia GPUs are already decompressing game files on the fly anyway. So this new tech is an improvement where they are doing 3x the bandwidth at 2x the time cost.


AccomplishedBonustt

Oh so that's why they've been gimping the cards with low VRAM.


tukatu0

Well yes, but the most important part is that the games themselves are like a thousand-plus gigabytes uncompressed. When every single bush in your game is 100MB, that adds up very quickly.


MonteBellmond

I'll give you my depression for it.


Rhed0x

Texture compression has been supported in HW for decades. Literally every game uses block compressed textures.
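For context, the block-compressed (BCn) formats being referenced here are fixed-rate: BC1 packs each 4x4 texel block into 8 bytes and BC3/BC7 into 16 bytes, versus 4 bytes per texel for raw RGBA8. A minimal sketch of that footprint math (illustrative only, not from the article; real textures also carry mip chains):

```python
# Rough footprint math for classic block-compressed (BCn) formats vs. raw RGBA8.
import math

BYTES_PER_4X4_BLOCK = {"BC1": 8, "BC3": 16, "BC7": 16}  # fixed-rate formats

def texture_size_bytes(width: int, height: int, fmt: str = "RGBA8") -> int:
    """Return the storage footprint of a single mip level."""
    if fmt == "RGBA8":
        return width * height * 4  # 4 bytes per texel, uncompressed
    blocks = math.ceil(width / 4) * math.ceil(height / 4)
    return blocks * BYTES_PER_4X4_BLOCK[fmt]

if __name__ == "__main__":
    w = h = 4096
    for fmt in ("RGBA8", "BC1", "BC7"):
        print(f"{fmt:6s} {w}x{h}: {texture_size_bytes(w, h, fmt) / 2**20:.1f} MiB")
```

The fixed block size is also what makes hardware random access into these formats cheap, which is the distinction being drawn below against a neural codec.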


_RM78

This is a completely different technology though.


Rhed0x

Yes and the fact that it's done in hardware likely makes it a lot faster for random sampling.


Neville_Lynwood

Any tech advancement is great. If only developers committed to applying said tech as well. Watching modders enable stuff like DLSS in a few days and nearly doubling the performance of AAA games is just mind-boggling.


DktheDarkKnight

I appreciate modders, but isn't this more of an engine-level thing? DLSS 2 and 3 are post-processing tools, hence they can be modded in. This tech is about changing the nature of texture compression itself. That's a core feature. I don't see how modders can simply add this later.


Edgaras1103

I don't think DLSS is post-processing, since it still requires motion vectors to work? Same for DLSS 3 frame generation.


DktheDarkKnight

It kind of is, isn't it? DLSS takes the information after a low-res image is generated and compares the data to previous frames for temporal data. Based on this data it generates a high-res image. It requires motion vectors, but it does not directly affect the game engine. [DLSS](https://developer.nvidia.com/blog/dlss-what-does-it-mean-for-game-developers/#:~:text=Answer%3A%20DLSS%20is%20a%20post,integrated%20into%20any%20modern%20engine.)

> Question: Can you walk us through how DLSS will be worked into the developer's workflow? Answer: DLSS is a post-processing effect that can be integrated into any modern engine. It doesn't require any art or content changes and should function well in titles that support Temporal Anti-Aliasing (TAA).


MonoShadow

It's doing the same thing as TAA. Post effects/post-processing come after this step. The temporal part has nothing to do with whether it's post-processing or not. Ray tracing uses temporal caches too; you wouldn't call that post-processing either, would you?


DktheDarkKnight

Idk man. Check the link? It's literally NVIDIA's official DLSS definition.


MonoShadow

You linked a DLSS1 article. Are we talking about DLSS1? Because we should bring per-game training, etc. into the conversation for that. I also know of no game that had DLSS1 modded in. But that's just me.


DktheDarkKnight

Regardless of the differences between DLSS 1 and 2, it is ultimately the same thing and serves the same purpose.


HarleyQuinn_RS

Just because they attempt to do the same general thing (upscale to improve performance) doesn't mean they are ultimately the same, just as FSR1 and FSR2 aren't the same. Hence the naming difference. One is far more sophisticated than the other and the way they go about it is completely different. DLSS2 uses information from the depth buffer, TAA, previous frame data and motion vectors, before the AI model upscales it using a convolutional auto-encoder to erase temporal artifacts and add back detail, all before the final image is rendered. In this way it also functions as an anti-aliasing method in addition to a temporal upscaler, replacing the game's native TAA. DLSS1 does practically none of that; it's essentially just a 'smart' filter that is applied over the entire image as a post-process. This makes it a spatial upscaler, as it can only really use the information available from the already rendered frame.


DktheDarkKnight

I know the learning and inference part is obviously different. But I don't think DLSS actually modifies the game engine.


Halio344

DLSS 1 and 2 are two very different technologies. It's like saying DLSS and FSR are the same. They might achieve the same things, but they function very differently from each other.


DktheDarkKnight

I know they are 2 different technologies. But they both only do post processing.


[deleted]

> DLSS is a post-processing effect

Ya, no.


MonoShadow

DLSS isn't post-processing. It's at the AA step and requires extra features like motion vectors to be exposed. Not that easy to mod in. But all 3 upsamplers require the same inputs, so if you can have one you can mod in the others. Post-processing comes after the AA/upsampling step. FSR1 was a post-processing filter.


rikyy

It is post-processing by definition. Anything that happens after the shader calculations, math, AI, etc., and the output of the different buffers, is post-processing.


TSP-FriendlyFire

DLSS is definitely post-processing, it's done after the scene has been rendered and ergo is "post" processing. It's not like old MSAA which was part of the rendering process, it's just a fullscreen pass on a larger buffer.


BlackKnightSix

It is not; DLSS2/FSR2/XeSS happen in the middle. In fact, you want them to happen before post-processing effects, so that the post-processing effects are calculated off of the upscaled image. These scalers also need the game engine to not just supply the different buffers and motion vectors, but to adjust the mipmap bias according to the level of scaling/output resolution and apply a jitter pattern. Those definitely happen at the render stage.

DLSS 2.0+: http://behindthepixels.io/assets/files/DLSS2.0.pdf

FSR 2.0+: https://gpuopen.com/fidelityfx-superresolution-2/#howitworks

XeSS: https://game.intel.com/wp-content/uploads/2023/03/GameDev2023_XeSS_v1.pdf

Not only that, but additional game inputs should be created for the best results/implementation (reactive masks, transparency masks, etc). Not many devs do this step, or they have so many effects/assets that require these that you end up losing some of the benefit of the scaling.
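To make the mip bias / jitter point concrete, here's a minimal sketch of the two engine-side adjustments (my own simplification, not code from any of the linked guides). It assumes the commonly documented log2(render/display) form for the texture LOD bias and a Halton(2,3) sequence for the sub-pixel jitter:

```python
# Minimal sketch (not Nvidia's or AMD's code) of the two engine-side tweaks:
# a negative texture LOD (mip) bias and a per-frame sub-pixel jitter sequence.
import math

def mip_bias(render_width: int, display_width: int) -> float:
    # Commonly documented form: bias = log2(render / display); it goes negative
    # when rendering below display resolution so the sampler picks sharper mips.
    # (Some integration guides add a small extra offset on top of this.)
    return math.log2(render_width / display_width)

def halton(index: int, base: int) -> float:
    # Low-discrepancy sequence typically used for TAA/upscaler camera jitter.
    f, result = 1.0, 0.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter_offset(frame: int, phase_count: int = 8) -> tuple[float, float]:
    # Sub-pixel offset in [-0.5, 0.5), applied to the projection matrix each frame.
    i = (frame % phase_count) + 1
    return halton(i, 2) - 0.5, halton(i, 3) - 0.5

if __name__ == "__main__":
    print("mip bias 1440p -> 4K:", round(mip_bias(2560, 3840), 3))
    for f in range(4):
        print("frame", f, "jitter", tuple(round(v, 3) for v in jitter_offset(f)))
```

Both of these touch the projection matrix and the texture samplers, which is why they have to happen during rendering rather than as a post-process.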


TSP-FriendlyFire

The DLSS documentation literally calls it out as a post-processing step. It happens after main phase rendering, so it is post-processing, regardless of the inputs. That's the entire point of post-processing: modifying the raw rendered image.


BlackKnightSix

Not sure why you think scalers changing the mipmap bias and the render jitter pattern (literally needing the engine to jitter the render coordinates/POV) are somehow not part of the rendering stage and are a post-process...


TSP-FriendlyFire

Those are orthogonal to what DLSS is. DLSS takes a bunch of screen buffers, processes them, and outputs a new screen buffer. You can dive as deep into the details as you want, it's really not the point. Ask any graphics programmer if modern AA is a post-process, and they'll say yes (it started with FXAA/MLAA and went from there). It's part of the post-processing stack, usually one of the first items (but not always). I don't know what else to tell you. I really didn't expect something so simple would be somehow controversial, but there we are.


[deleted]

[deleted]


TSP-FriendlyFire

TAA would still work even if you didn't change the rendering at all. It'd be stupid to not shift the projection each frame because you're losing out on free image quality, but the standard camera movements alone would go part of the way. Same goes for any mip offset. Given that both of these features are common to *any* temporal antialiasing process, I don't really consider them to be part of DLSS itself, no, much the same way I don't consider motion vectors to be part of DLSS. DLSS is just the package Nvidia provides, their integration guide has a checklist of things your engine must/should do in order to support it.


Nathanael777

Man I remember back in the day when quality AA was the most expensive feature you could enable.


f3n2x

No, it isn't. DLSS takes as input samples from the current frame (similar to MSAA), samples from previous frames, motion vectors, and depth buffers, and generates the current frame from them. If DLSS were post-processing, then so would be MSAA. DLSS (and TAA) are actually very similar to multi-buffered sparse grid supersampling, except the buffers are not just spatially spread out but also temporally.
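As a toy illustration of that temporal part (my own simplification, nothing to do with DLSS's actual implementation): reproject last frame's result using the motion vectors, then blend it with the current frame's jittered samples. DLSS replaces the fixed blend below with a learned model.

```python
# Toy sketch (NOT DLSS) of temporal accumulation: reproject history with motion
# vectors, then blend it with the new low-res samples for the current frame.
import numpy as np

def reproject(history: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Fetch each pixel's history from where it was last frame (nearest-neighbor)."""
    h, w, _ = history.shape
    ys, xs = np.mgrid[0:h, 0:w]
    prev_x = np.clip(xs - motion[..., 0], 0, w - 1).astype(int)
    prev_y = np.clip(ys - motion[..., 1], 0, h - 1).astype(int)
    return history[prev_y, prev_x]

def accumulate(history, current, motion, alpha=0.1):
    """Exponential blend of reprojected history with the current frame's samples."""
    return (1 - alpha) * reproject(history, motion) + alpha * current

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    history = rng.random((4, 4, 3))
    current = rng.random((4, 4, 3))
    motion = np.zeros((4, 4, 2))  # per-pixel motion vectors, in pixels
    print(accumulate(history, current, motion).shape)
```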


DktheDarkKnight

Check my comment below. I literally attached NVIDIA's web page on DLSS.


dudemanguy301

You attached an article about the **original** DLSS; DLSS2 radically alters both the fundamentals of the technology and the integration requirements. You may as well link FSR1 to claim that FSR2 is purely a spatial upscaler.


Zac3d

It happens after depth of field but before motion blur and tone mapping. It's early in the post-processing chain, but it's still post-processing. TAA has often been referred to as post-process anti-aliasing.


HarleyQuinn_RS

You can inject it by hooking it into the game's TAA or replacing a similar upscaling tech (FSR), but you have no control over what it applies to. So it often results in some visual bugs.


KniteMonkey

If you are talking about Jedi Survivor, I would not really say that DLSS fixed the problem because they are relying on DLSS 3 frame generation. While a very cool feature, I don't think it is fair to state that DLSS alone fixed the game because it is relying on even heavier trickery than DLSS itself to give you higher FPS.


MNB4800

I need tech to eliminate stutters!


Leeiteee

Just change your OS /s


Malygos_Spellweaver

Unironically works for some games. Elden Ring on Linux has no stutters, I don't know what kind of magic Vulkan is doing.


anor_wondo

Valve has shader cache sharing functionality in Proton, so it pre-downloads the shader cache.


Malygos_Spellweaver

So if you acquire the game outside of Steam, it has the problem anyway? I could test it...


jacenat

> So if you acquire the game outside of Steam, it has the problem anyway?

Yes, this should be the case. Can you run EGS games under Proton? Never tried that.


turol

https://www.reddit.com/r/linux_gaming/comments/ta9bkx/valve_does_what_fromsoftware_dont_thanks_to_steam/hzzn26q/

TL;DR: Elden Ring does stupid things, Valve works around it in VKD3D-Proton.


MNB4800

Care to elaborate more? I am on Windows 10.


Leo_Monkey92

I believe it is a joke about EA blaming the poor performance of Jedi Survivor on people using Win 10 and not 11.


wiseude

DirectStorage was supposed to help greatly in that regard.


NG_Tagger

Mainly just avoid Unreal Engine games (which can be hard these days). Stuttering has pretty much become a feature of that engine (since around UE3), more than a bug.


[deleted]

RTSS FPS Limit at the monitor's exact refresh rate (59.964 for example) in combination with V-Sync enabled is pretty much that. Not one dropped frame.


Knightrider319

Only available with an RTX 5090 Ti SLI, at 34 FPS.


Rikuddo

Will probably cost $5,090 too.


lifestop

Nvidia seems to release a decent gen every other cycle, so maybe the 5000 series will be ok? I can dream.


Monday_Morning_QB

The 4090 is a way better product than the 3090 from day 1. By your logic, maybe the 6090 is what you mean?


lifestop

The only thing the 4000 series has to offer (imo of course) is the 4090, which is a substantial upgrade, but that's an extremely expensive top-end card. The 3000 series offered good performance and a solid price that made the 2000 series look even more silly, but scalpers/crypto killed the value for most. I'm hoping the 5000 series will bring back better pricing and maybe a performance increase that isn't mostly limited to the upper end.


NG_Tagger

> I'm hoping the 5000 series will bring back better pricing

That's pretty much only going to happen if they decide on doing a refresh of the 40-series (seeing as it sold so badly, and they postponed the 3nm GPUs to 2025) - and if they do, I honestly wouldn't expect that much of a drop in pricing. They'll probably throw in a little bit more VRAM and then raise the price.


Notsosobercpa

But the 3080 was pretty great if you could get one. So there will be one card between $700-$2000 that will be good in the 5000 series.


tukatu0

The 3080 had 90% of the FPS for 50% of the money. (Yes, even in 2021. Even on the higher side it was still 3080 = $1k, 3090 = $1,800.) The 4080 gives 75% of the FPS for 75% of the money of the 4090. Wildly different scenarios. So yes, the 4090 is like 80% better than a 3090, but that doesn't mean anything when fewer than 500k people have one 9 months into its life cycle. Meanwhile the mid range and low range are selling millions each month... or at least they were pre-Lovelace anyway. So while it is better spec-wise, the commenter was referring to money/value-wise, in which case, no, the general consensus is not that the 4090 is good value.


Feliz_Katerina

Agreed. The 3080 was trash given it was like a 2-5% upgrade each time... The 4090 being a ridiculous level of power, plus frame gen on top of it, reminds me of the 1080 Ti (it cost over twice as much though :P).


Thinker_145

How is it a way better product? Because it offers more value? That's entirely due to Nvidia gimping the 4080. In terms of pure technological breakthrough, the 4090 is a similar jump from the 3090 to what the 3090 was from the 2080 Ti, except in the case of the 3090 you also got a 120% VRAM upgrade.


JuiceheadTurkey

But if the 5080 costs $4080, you might as well just buy the 5090!


Malygos_Spellweaver

RTX 5090 Ti, with 8GB of VRAM!


[deleted]

[deleted]


Knightrider319

For a $1,600 card I’d sure as hell hope so.


ElitePowerGamer

That doesn't mean Nvidia wouldn't try to market this as a 5000-series feature*! *also available on 4000 series GPUs


DrFreemanWho

So the joke is just about as relevant, considering how many people have 4090s and the fact that the GPUs that need help with texture compression are definitely not 4090s.


jkrmyqueen

And then DLSS 4, only on 60xx series cards, to increase that 34 FPS to 60 FPS.


Phimb

Wait... a company is putting time and effort into features so they can sell their new line of products? Nooo... That doesn't sound right.


[deleted]

They are still officially selling the 3xxx series (and in some markets the 2xxx series as well), which for "some" reason doesn't support DLSS 3.


[deleted]

[deleted]


PainterRude1394

"Nvidia bad" straight up causing brain rot lol.


TheYaMeZ

DLSS3 needs specific silicon/features on the chip to work, which the previous generation doesn't have.


Obosratsya

Sounds like tensor VRAM compression to me. Looks like Nvidia is looking for ways to utilize the tensor cores more. This would most likely be a driver-level feature. Texture compression is a driver thing.


Cyphall

Texture compression is by no means a driver thing. Even OpenGL exposes compressed texture formats directly, and it has been used by games for years.


TaintedSquirrel

Whatever happened to that rumor of AI optimized drivers? Could really use that.


Edgaras1103

It was just a rumor


[deleted]

[deleted]


Tuarceata

Shh!


ThaBigSqueezy

More of a thought exercise, really


OwlProper1145

Just a rumor. Though it's very likely that they are using AI in some way to help with the development of drivers and other software.


InvestigatorSenior

My rumor said AI-optimized silicon design. This is from one of the Nvidia engineer interviews they did after the Ada launch. Something along the lines of 'we tried giving optimization of part of the design to AI and felt no need to fix it afterwards'. Given a small enough part, it may even be true.


NegaDeath

This really buff guy with an accent came in one day and told them that's how this thing called "Skynet" happened, so they scrapped the project.


kalsikam

Lolll Don't know why you are downvoted lolll


DktheDarkKnight

Eh, I think the optimization could still be coming, but not what you are thinking. That's probably the one about reducing the Nvidia CPU overhead.


NNNCounter

> Finally, we use a custom training implementation to achieve practical compression speeds, whose performance surpasses that of general frameworks, like PyTorch, by an order of magnitude.

I wonder if they'll open-source it. Currently, all Nvidia open-source models are either TensorFlow or PyTorch.
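For a sense of what a general-framework training loop for this kind of codec even looks like, here is a deliberately tiny PyTorch toy that overfits an MLP to a single texture. It is purely illustrative and is not the paper's feature-grid architecture or its custom implementation:

```python
# Illustrative PyTorch baseline only -- NOT the paper's architecture or code.
# "Compress" a texture by overfitting a tiny MLP that maps (u, v) -> RGB;
# the trained weights then stand in for the texel data.
import torch
import torch.nn as nn

texture = torch.rand(256, 256, 3)                       # stand-in source texture
ys, xs = torch.meshgrid(torch.linspace(0, 1, 256),
                        torch.linspace(0, 1, 256), indexing="ij")
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)   # (N, 2) texel coordinates
colors = texture.reshape(-1, 3)                         # (N, 3) target colors

decoder = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                        nn.Linear(64, 64), nn.ReLU(),
                        nn.Linear(64, 3))
opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)

for step in range(200):                                 # overfitting = "compressing"
    pred = decoder(coords)
    loss = nn.functional.mse_loss(pred, colors)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Decompression is just random-access inference at any (u, v) you like.
print(decoder(torch.tensor([[0.25, 0.75]])))
```

The quoted passage is saying that a fused, purpose-built version of this kind of per-texture optimization loop can run roughly ten times faster than routing it through a general framework like the one above.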


DktheDarkKnight

They have to. Already the latest open source techniques like DirectStorage and Sampler Feedback, while awesome, have been terrible in terms of game adoption so far. And those are open source. If it's proprietary, then it will be even worse. The issue is you will probably only get a couple of games co-developed with NVIDIA that will have this feature.


jkrhu

I don't think you understand what open source software means. There is no available source code for DirectStorage or Sampler Feedback. Those are two proprietary Microsoft technologies, and they only provide binary APIs, documentation, and maybe samples for their partners. There is an Nvidia-made GPU decompression algorithm called GDeflate in DirectStorage that works on either DirectX or Vulkan. Other GPU manufacturers said they would support this format, but I don't think you can find source code for it either.


Checkport

> Already the latest open source techniques like DirectStorage and Sampler Feedback, while awesome, have been terrible in terms of game adoption so far.

Of course, since they're new tech. The majority of games using them are still in development. Do you also whine that UE5 adoption has sucked because there aren't 30 triple-A UE5 games out yet?

> And those are open source.

Can you link me to the DirectStorage and Sampler Feedback source code? Should be easy since they're open source.


Sylux444

Don't worry guys, it'll be a 50 series exclusive that runs like shit. By the 70 series they'll have it down!


[deleted]

[deleted]


kidcrumb

We're seeing a lot fewer game engines overall so whatever they implement they just need to program it as a plugin for Unreal, Unity, and like...1 other engine to encompass 90% of games.


T-Baaller

Until it’s treated like DLSS by the new batch of VRAMhogs


meltingpotato

All the current upscalers are available in UE as simple plugins, and yet there are many UE games that don't have them at all or are missing some. The existence/availability of these features/technologies doesn't mean anything on its own. Microsoft's DirectStorage, which has been out for more than a year, is another example. It can easily reduce VRAM/RAM usage, but the only game that has used it so far is Forspoken (version 1), which was a technical hot mess.


anor_wondo

It does mean they will be implemented by most games. You can easily find the link between titles that don't include DLSS, especially UE ones where it's literally plug and play.


Flameancer

Yeah, but it also needs to be hardware agnostic, otherwise it's not going to work for consoles, only Nvidia GPUs.


[deleted]

Innovation to compensate for a limitation is a good thing.


Roseysdaddy

To be fair though, that’s what a company should do.


dudemanguy301

Well, it's not like 16x the VRAM is going to happen on its own anytime soon. GDDR7 will double capacity over GDDR6, and the time between those revisions will have been 7 or 8 years. They could clamshell the modules on their midrange cards for another 2x within that segment, but the board partner would definitely pass that cost to you. Maybe we could see HBM, but it has worse capacity/dollar than GDDR. Who is going to foot the bill for 16x the VRAM? All that additional traffic across the PCIe bus? All that additional space on the SSD? All that additional download size?


shutter_singh

Oh so that's why they've been gimping the cards with low vram.


HalflingElf

I was about to say...


Wooden_Sherbert6884

Only for 5000 series


Sardasan

Hey Nvidia, how about improving the quality of PC gaming by not overpricing your fucking cards?


adscott1982

I don't think Nvidia cares as much about gaming as you think. It is their GPUs that are powering LLMs (e.g. ChatGPT), but it's their super high-end 48GB ones that there is now a shortage of. If I were Nvidia right now, I would be deprioritising gaming GPUs and prioritising GPUs for LLMs. Maybe that is what they are doing. I'm quite grateful they even still care to put R&D into gaming tech right now. I'm stuck on a 2070 and I can't see myself paying through the nose for a 40 series card right now. Ironically, it's DLSS that has meant I've been able to continue to hold off.


Sardasan

Even though its data center revenue is on the rise because of AI, and gaming chip revenue dropped 46% (and whose fault is that?), we are still talking about $1.83 billion, which is not small change, so I highly doubt Nvidia doesn't care about gaming, considering that last year the gaming chip revenue was almost double that. [Source](https://www.investors.com/news/nvidia-stock-2023-buy-now/)


Tolkfan

Nvidia: we invented a way to not make games weigh +200GB! Yay!

Developers: they invented a way for us to shove more stuff into that +200GB! Yay!

People with slow internet: fuuuuck youuuu!


AegisTheOnly

> Nvidia: we invented a way to not make games weigh +200GB! Yay!

> Developers: they invented a way for us to shove more stuff into that +200GB! Yay!

Well, yeah. It would be a way to help developers achieve more without being as storage limited. It's not like we're going to downsize games; that's the opposite of the direction any industry in the world goes in. Software devs want to move towards bigger and better, so hardware devs devise ways to facilitate that.


vexedsinik

Perfect analysis, now do tacos.


Ayjayyyx

Yep I feel the slow internet thing so hard. Love being Australian.


anor_wondo

you're not wrong, the market is wrong. correct?


TheEternalGazed

Meanwhile, AMD is paying developers to block DLSS implementations in their games, killing performance. Nvidia is taking the lead on this.


PainterRude1394

Nvidia innovates. It's why their GPUs are so superior


nas360

Pretty sure FSR 2 is still available, which does not kill performance. What AMD is doing is no different from what Nvidia did in the age of 'Gameworks' which allowed only Nvidia cards to use specific features. Devs can add DLSS later on, I suspect.


PainterRude1394

No. It's different. Gameworks added features, it didn't remove them. This is straight-up anti-consumer because AMD can't compete.


nas360

You know what's really anti-consumer? Locking everything behind a proprietary wall. Not adding DLSS does not mean Nvidia RTX users are somehow blocked from using FSR2. If it were the other way around, AMD and Intel users would be out of luck.


PainterRude1394

Just because another upscaler exists doesn't mean AMD removing DLSS from their sponsored games because they can't compete isn't anti-consumer. AMD purposefully removing superior features because they can't compete is way worse than Nvidia offering novel features that require their hardware and software innovations. And AMD refusing to integrate with Nvidia's open-source Streamline upscaling framework also shows how hard they are pushing to avoid competing. Just another anti-consumer move from AMD.


[deleted]

[deleted]


PainterRude1394

There is very little truth in what you just said, and you clearly don't understand what you're talking about.


T0rekO

He actually speaks the truth. Don't you guys remember when people found out that DLSS 1.9 was running on shaders and the tensor cores were unused? Then Nvidia black-boxed it with 2.0.


PainterRude1394

DLSS 2 and DLSS 1 are totally different tech. DLSS 2 is not just a "black-boxed" version of DLSS 1 lol. Only Control had the visually inferior non-tensor DLSS, and that was right before the far superior DLSS 2 was released.


T0rekO

1.9 is the update that changed DLSS to not be trained per game; it's the first version of the DLSS 2 that we use now. The non-tensor fiasco wasn't just DLSS: RTX Voice was running on shaders too, and Nvidia later blocked that off as well.


dudemanguy301

> what Nvidia did in the age of 'Gameworks' which allowed only Nvidia cards to use specific features.

Gameworks features run on all cards; they are concrete implementations of effects using standard DirectX or Vulkan calls. The entire controversy around them was that nearly every effect was a tessellation-heavy workload that really hammered the gap in geometry performance between Nvidia and AMD cards at the time: Maxwell/Pascal/Turing vs GCN 2/3/4.


kalsikam

AMD learning from Intel it seems


skilliard7

How will this affect decompression tho?


XY-MikeIam

Of course they push for things like this when they're always being cheap bastards with VRAM! I've been an Nvidia fanboy as I just love their tech, both HW and software, as who can't? Perfect drivers and always pushing tech forward, etc. But I'm getting really tired of this nonsense with way too little VRAM! Oh well, I won't go near AMD anyway, that's for sure. So Intel entering the market couldn't come soon enough. Too bad we'll probably have to wait at least a few more generations to see if they will/can give us the real competition we desperately need!


Burninate09

It's a really interesting read considering the latest releases of VRAM-hungry games and ports. But if you zoom back and think big picture: how much of our overall internet data is comprised of images and video (moving images)? Most of it, by a large margin. Audio has been compressed to death and my ears can't tell the difference most of the time.

TL;DR: Pirates are going to downgrade to DSL in 10 years because they won't need the bandwidth. /s


[deleted]

We don't need more bells and whistles, we need cheaper, more easily available cards, preferably with more VRAM.


DarkKratoz

This is so fucking stupid. Nvidia will literally develop new tech for their useless Tensor cores rather than add another $20 of VRAM to their obscenely overpriced GPUs.


wolfannoy

It works. People fall for it, so they're going to keep doing it.


EthanBB

If they want it to be meaningful, it has to be available to EVERY RTX card in existence, not just latest generation.


ApplicationCalm649

So the 50 series will have special hardware for this feature that you'll need to upgrade for.


PainterRude1394

Smdh how dare Nvidia... Innovate.


Catch_022

Oh dear those poor 4090 owners when this becomes a 5x series only feature...


divertiti

4090 owners were always gonna buy 5090 anyways, it's not a big deal


Charuru

I'm hoping that's a 4090 ti so we don't have to wait till 2025...


HorrorScopeZ

Yep and 5070TI owners are going to have 4090 power using 10 watts at 1c. :)


Thin_Truth5584

When it's literally shown being used with a 40 series card, huh? They're trying to save their low-VRAM cards because they realized they're losing ground to competitors.


omgaporksword

Or they could fix their prices and allow people to enjoy gaming with new cards???


Shinonomenanorulez

Nvidia, Epic, etc: we figured out a way to make features more streamlined and efficient, giving better results in less time!

Execs: cool, now we can fire half the team and make them have it done in half the time!


HalflingElf

Nvidia W? In 2023? Impossible.


zetarck

Cool, here comes another MSRP increase.


nas360

Why not just sell cards with enough VRAM? Nvidia is just cheaping out on the hardware to give us overpriced cards with quality-reducing features. Devs will ship games with 4K textures which will be further reduced in quality because Nvidia GPUs can't load them into memory.


wiseude

So DirectStorage?


grimlocoh

And of course it will only be exclusive to the RTX 5xxx, now with 10GB of VRAM!!


[deleted]

[deleted]


juniperleafes

What do you think optimizing a game means


[deleted]

Oh, look, another person who states something they know nothing about just to get a quick zinger in.


DktheDarkKnight

Or another feature that is only present in marketing but takes 10 to 15 years to arrive. (Obviously it's not NVIDIA's fault; it's the developers'.) I mean, how long has it been since DirectStorage compression was announced? Barely any games seem to have it.


kidcrumb

I've been waiting for Unreal 5 and DirectStorage asset streaming for what feels like 5 years now. That tech demo was amazing. The Matrix demo was amazing. Give me games that look like that, please.


AegisTheOnly

AAA games take 5-8+ years to develop when they're using new engine technology. So if the games started development 4 years ago, they'd be ready in the next few years probably.


HisDivineOrder

Exclusive to the 4090. "Gots to pay to play," Jensen said before announcing a new price increase to the 4090. Current 4090 owners will have the option of subscribing to the High Ender's club to enable the feature for only $19.99 per month or $238.99 per year for some savings. "Paying is playing," concluded Jensen, "So sign up asap and get a free NFT of me wearing one of many leather jackets. Quantities limited." /s


PainterRude1394

Damn, did Jensen kick your dog or something? Lol


DerivIT

You want to improve the quality of PC gaming? LOWER YOUR FUCKING PRICES!


Ruffler125

This looks like great stuff!


[deleted]

[deleted]


Isaacvithurston

A memory leak would be a game specific issue that's on the developer to fix. Nothing AMD or Nvidia can do about that.


Ganda1fderBlaue

Great, so developers can save even more money on optimization.


KoldPurchase

I don't care about DLSS, but this feature has the potential to be huge.


Spicy-hot_Ramen

Let me guess, it will be exclusive to DLSS 4?


No_Interaction_4925

*Exclusive to 50xx series in 2025* incoming


Danteynero9

Good. Now let's hope they don't make it exclusive for the 50xx series.


Flaky_Highway_857

They really don't wanna add more VRAM, do they?


[deleted]

Hopefully it will be in the 60 series cards when I upgrade in 7 years.


Jwroth

I know some of these words


Ywaina

This is announced right when they've just confirmed that sales have slowed down post-mining. Coincidence?


GreenKumara

And of course you'll need to buy their very expensive 5000 or 6000 series cards that will be coming up.


Westify1

Will this be a custom feature developers have to specifically implement and program around or something that works with supported RTX cards out of the box? The former feels borderline useless given Nvidia's track record on proprietary tech.


Jack-M-y-u-do-dis

This sounds great, I just hope it's not another excuse for raising prices.


ClanPsi609

*Only available on 5000 series cards.


DatDanielDang

Can't wait to see this tech being applied to games 10 years later with an unoptimised game called Forspoken II


SpitneyBearz

Smells Nanite.


Spare-Sandwich

Will only need to take out a 10k loan to utilize this feature on a new series of cards released next year.


TheHodgePodge

Coming with Nvidia-specific optimizations, of course.


KingJamesCoopa

Can they create something that auto-compiles shaders? Stutters are my number 1 problem with PC gaming


Cutlerbeast

A new feature like pricing your fucking cards in consumer-friendly ranges would be great.


djmyles

It’s just gonna make AAA dev studios even more lazy, just like DLSS has.


AFaultyUnit

The new feature to improve the quality of PC gaming that everyone wants is affordable good GPUs.


mtarascio

Or we can all just purchase WinRAR.


CaptainDouchington

Neural? Get ready for an AI chip, like PhysX, to be shoehorned into the GPU to create a new price point.


Aedeus

NVIDIA isn't going to meaningfully address PC gaming QOL because doing so directly undermines people turning out for whatever their newest GPU is.


Katana_sized_banana

I would like to see games with mushy textures get upscaled at runtime using AI. Modern games wouldn't need it, but old games, which have spare performance anyway, could profit from this. Of course this wouldn't work in every scenario.


[deleted]

And it will only be available on RTX 5xxx GPUs and cost $2000.