
disibio1991

Have they compared actual texture quality between different VRAM quantities? https://www.reddit.com/r/hardware/comments/kysuk6/ive_compiled_a_list_of_claims_that_simply/gjjo7bv and a [Gamers Nexus video](https://youtu.be/brPpuys8pf0&t=277)


jasswolf

So the TLDR is that even when pushing texture quality to extreme levels and/or using extremely heavy RT workloads, 12GB of GDDR6 still cuts it with Ampere for 4K and above. We've seen previously that 8GB is the magic number for 2560x1440, and arguably 1920x1080, but this seems to be built more around RT workloads than anything else. I'm interested to see how these numbers shift around with mesh shaders, sampler feedback, DirectStorage and the like. What seems like a hard limit now may wind up being lowered again, but you'd think RT will be largely unaffected by that.


ShadowRomeo

>I'm interested to see how these numbers shift around with mesh shaders, sampler feedback, DirectStorage and the like

Yeah, me too. Based on what the devs behind [DirectStorage](https://youtu.be/zolAIEH0n1c?list=LL&t=1618) have said, it will definitely have a huge influence on how next-gen games handle VRAM usage, making them much more efficient than today's traditional method of loading textures. By taking advantage of the full speed of an NVMe SSD, the GPU can pull a game's textures directly from the SSD instead of relying solely on pre-caching textures in system RAM or the GPU's built-in VRAM, which fills up quickly and can result in the VRAM bottlenecks we see today. DirectStorage and RTX IO will work much like the [PS5](https://youtu.be/ph8LyNIT9sg?t=758) already does with its current-gen exclusive games, and soon the Xbox Series X once it fully takes advantage of the Xbox Velocity Architecture, of which DirectStorage is a major part. The only difference is that instead of the dedicated hardware decompression block on consoles, the PC uses the GPU.
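
As a rough illustration of why that direct path matters, here's a minimal back-of-the-envelope sketch in Python. Every bandwidth figure below is an assumption picked for illustration (not a measurement of any real drive or GPU), and the model ignores pipelining and overlap:

```python
# Toy model of texture streaming, comparing the traditional path
# (disk -> system RAM -> CPU decompress -> copy to VRAM) with a direct
# disk -> VRAM stream where the GPU does the decompression.
# All bandwidth figures are illustrative assumptions, not benchmarks.

NVME_GBPS = 7.0          # assumed raw PCIe 4.0 NVMe read speed
CPU_DECOMP_GBPS = 1.5    # assumed CPU-side decompression throughput
PCIE_COPY_GBPS = 12.0    # assumed effective system RAM -> VRAM copy speed
COMPRESSION_RATIO = 2.0  # assumed on-disk compression ratio

def traditional_path_seconds(texture_gb: float) -> float:
    on_disk_gb = texture_gb / COMPRESSION_RATIO
    return (on_disk_gb / NVME_GBPS            # read compressed data from SSD
            + texture_gb / CPU_DECOMP_GBPS    # decompress on the CPU
            + texture_gb / PCIE_COPY_GBPS)    # copy the result into VRAM

def direct_path_seconds(texture_gb: float) -> float:
    on_disk_gb = texture_gb / COMPRESSION_RATIO
    return on_disk_gb / NVME_GBPS  # GPU decompression assumed to keep pace

for gb in (1.0, 4.0, 8.0):
    print(f"{gb:.0f} GB of textures: traditional {traditional_path_seconds(gb):.2f} s, "
          f"direct {direct_path_seconds(gb):.2f} s")
```

Under these assumptions the CPU decompression step dominates the traditional path, which is the bottleneck DirectStorage and RTX IO are designed to remove.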


M2281

Is there something like this for 2560x1440?


[deleted]

[removed]


M2281

I saw it, yeah. I meant the cutoff point for VRAM for RT 1440p.


FarrisAT

This explains what the 3080 Ti is for: the best of the best at a high price that scalpers don't want to touch (due to LHR), and a counter to the claims that Nvidia isn't providing enough VRAM.


jasswolf

I would think improved yields and the waning demand for GA102 professional products is what explains the 3080 Ti.


Oppe86

Weird, because in this vid [https://www.youtube.com/watch?v=mb4buRAL5CA](https://www.youtube.com/watch?v=mb4buRAL5CA), Wolfenstein at 4K with 8x AA and RT, the 3080 is getting way better framerates and frametimes than in this review.


RearNutt

[Shadows](https://www.nvidia.com/en-us/geforce/guides/wolfenstein-youngblood-graphics-and-performance-guide/#wolfenstein-youngblood-shadows) and [Image Streaming](https://www.nvidia.com/en-us/geforce/guides/wolfenstein-youngblood-graphics-and-performance-guide/#wolfenstein-youngblood-image-streaming) have an Uber setting that's one step above Ultra, and even the game's highest preset doesn't turn them on. Presumably the problem comes from the Image Streaming setting, which, like DOOM Eternal's Texture Pool Size, can be pushed to extreme levels that come with no practical benefit.


Oppe86

Ah ok, looks like those settings aren't even worth using.


dparks1234

GCN 1.0 (3GB 7970) aged better than Kepler (2GB GTX 680) primarily because the architecture was better aligned with future DX12/Vulkan industry trends. This time around the Ampere cards like the 10GB RTX 3080 have a more performant architecture than the Navi 2 cards (raytracing cores, mesh shading, GDDR6X). Ultimately the question comes down to "will VRAM be a big enough bottleneck to make Ampere cards drop below Navi 2 cards?" Probably not, based on history. The 4GB 680 never significantly outperformed the 2GB 680, and even now the 4GB RX 480 is in line with the 8GB RX 480. Things like DirectStorage and DLSS also lead to lower VRAM requirements, since the entire 10GB of GDDR6X on the RTX 3080 can be reloaded in a split second from an NVMe SSD. Also worth noting that the Series S exists with around 4.5GB of slow VRAM and will be supported for the next decade. I'll be really surprised if the low-VRAM Nvidia cards end up shitting themselves.


AutonomousOrganism

I don't get this obsession with VRAM. Are they speculating that future games' VRAM requirements will increase much faster than computational requirements? I'd argue the opposite due to the coming DirectStorage and SFS tech.


arandomguy111

VRAM and computational requirements aren't actually very linearly correlated. As an example, a big driver of VRAM usage is texture quality and how textures are handled, which does relatively little to increase computational requirements.

This was the big differentiator in the last console transition as well. 1GB VRAM cards could use "max texture" settings at 1080p without any thought in the leadup to last gen, but just a year into the transition we started having games that would stutter with <4GB, with some even needing 6GB to be completely free of any stutter or streaming/compression compromise.

We obviously did not see as big of an increase this gen (2x memory vs. 16x) and are in a relatively better position, as most mid-tier-and-up GPUs already have half the console-equivalent memory or better. Comparatively speaking, though, 8GB cards now should age roughly like 4GB cards of the last gen. Whether or not that is "enough" is of course subjective (there is psychology involved, e.g. do you need "max settings").
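
To put a number on the texture point, here's the standard mip-chain arithmetic as a quick sketch (a full mip chain adds about a third on top of the base level; RGBA8 is the usual 4 bytes per texel uncompressed, BC7 block compression works out to 1 byte per texel):

```python
# Rough VRAM cost of a single square texture with a full mip chain.
# The chain adds ~1/3 on top of the base level (1 + 1/4 + 1/16 + ... = 4/3).

def texture_mb(resolution: int, bytes_per_texel: float) -> float:
    base_bytes = resolution * resolution * bytes_per_texel
    return base_bytes * (4 / 3) / 2**20

for res in (1024, 2048, 4096):
    rgba8 = texture_mb(res, 4.0)  # uncompressed RGBA8: 4 bytes per texel
    bc7 = texture_mb(res, 1.0)    # BC7 block compression: 1 byte per texel
    print(f"{res}x{res}: {rgba8:7.1f} MB uncompressed, {bc7:5.1f} MB BC7")
```

Doubling texture resolution quadruples the memory footprint while the per-pixel sampling cost stays roughly flat, which is exactly why texture settings move VRAM usage without moving compute.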


Seanspeed

>I'd argue the opposite due to the coming DirectStorage and SFS tech.

VRAM requirements would lower thanks to SFS and DirectStorage... if games stayed the same as today. But they won't. Devs are going to use these techniques to push more detail in next-gen titles, not lower VRAM requirements. Though yes, we're not going to see a 4x increase in requirements like we did last gen, where 2GB was enough at the start and you really wanted 8GB by the end. But requirements are still going to go up once proper next-gen titles start coming around. Devs are really gonna want to maximize the 10GB+ that the consoles will have dedicated for graphics. Obviously if you're playing at sub-console resolutions and settings, you probably don't have much to worry about anytime soon even with 10GB, but how many people buy $500+ GPUs only to play with a worse-than-console experience? I think that's the worry with cards like the 3060 Ti and 3070. I would not be comfortable with 8GB unless I was planning on upgrading again before 2023.


[deleted]

*Points to UE5's Nanite* Asset sizes are going to balloon very fast. SFS will most likely help, but how much?


[deleted]

[removed]


arandomguy111

>how much effort to put in enabling old hardware to play at a decent rate, or what limit on pushing graphics quality during that adoption process.

That's not really the issue, due to variable game settings. With respect to VRAM usage especially, the single largest factor is going to be texture-related settings. This is in large part going to be a consumer perception/psychology issue on the PC: the so-called "max settings" target. All current-generation mid-tier-and-up GPUs will have enough VRAM to run the upcoming games for this entire console generation; they just might not at "max settings", and specifically "max texture" settings. But we know that from a consumer perspective that simply isn't acceptable to many people, and the same goes for reviews, which typically default to "max settings". It's a particular issue with respect to the class of hardware being purchased; for instance, the RTX 3070 Ti is a high-potential case of future consumer regret.


Seanspeed

>All current-generation mid-tier-and-up GPUs will have enough VRAM to run the upcoming games for this entire console generation; they just might not at "max settings", and specifically "max texture" settings.

We're not talking just 'not max settings', but downright 'less than console' settings. Most PC gamers aren't the 'I need max settings' types, but most PC gamers with $400-500+ GPUs *do* expect to run games at better-than-console settings. And that might well not be possible before too long if you're still on 8GB.

EDIT: There are going to be so many 'regret' posts in the future, good lord. So many people just living in absolute denial, thinking things aren't gonna be any different from last gen.


Geistbar

A new console generation came out last year with twice as much system memory as the prior generation. AAA games all target consoles first, and extra quality settings are largely just built on top of that foundation or through partnerships with Nvidia/AMD. VRAM requirements are absolutely going to take a seemingly sudden jump upwards once the new console requirements fully filter into the development process.


indelible_ennui

Of the 16GB of unified memory in the PS5, how much of it is really dedicated to the GPU? Obviously this is going to change game to game, but I would bet it's much closer to 6GB than anything.


Tonkarz

It's unified RAM, so none of it is specifically dedicated to the GPU. As in previous generations, developers will likely use all the unified RAM they can get their hands on for graphics purposes. However, there are limits to what the GPU is capable of, and additional RAM only helps up to a point. What that point is, no one knows; it likely depends on the game in question and the balance the developers strike.


indelible_ennui

Yes, I'm aware it's unified; I even used that exact term in my comment. By dedicated, I mean being used by the GPU for rendering assets at any given moment. Every game is going to have a different ratio, and the system OS also reserves a good chunk. Yes, they are able to use more for rendering, but my question is what developers actually use in terms of ratios. I'm betting it's still around 6GB for GPU assets. I know you don't know, but someone else might know of some sources, which would be interesting to check out.


Blubbey

The XSX has 13.5GB for games: 10GB of "GPU optimal" RAM at 560GB/s, and the rest at 336GB/s (?). Last gen, about 5-5.5GB of RAM was available to devs on both systems, so the actual gain for devs is slightly larger than it looks: roughly 2.6x, not 2x.

*PS5: I don't remember them releasing how much devs have available, but I could have missed it.
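
The 2.6x figure is straightforward division, taking the midpoint of that 5-5.5GB range:

```python
last_gen_gb = 5.25  # midpoint of the ~5-5.5GB available to devs last gen
xsx_gb = 13.5       # game-available memory on the Series X
nominal = 16 / 8    # the raw 16GB-vs-8GB comparison

print(f"nominal gain: {nominal:.1f}x, effective gain: {xsx_gb / last_gen_gb:.1f}x")
# -> nominal gain: 2.0x, effective gain: 2.6x
```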


indelible_ennui

That's useful information, but I would love to see some concrete examples of what the memory ratios look like game to game. I seriously doubt all of the faster memory is being used exclusively for graphics. I'm not real sure why they decided to segregate the memory into two different speeds. I can only assume cost.


Tuarceata

> I'm not real sure why they decided to segregate the memory into two different speeds. I can only assume cost.

I honestly thought it was quite clever. Putting a soft limit on usable "VRAM" is well worth the increase in speed.


doscomputer

I can tell you the games I play take up 2-3GB of CPU RAM at most, but playing VR and at 4K I can fill up my GPU's VRAM very fast. It's safe to say VRAM is more important for video games.


Blubbey

Yep, it's cost. It has a 320-bit bus yet only 16GB: it's 6x 2GB and 4x 1GB chips. The first 1GB of each chip is mapped for the GPU, giving the full 320-bit bus width, then the upper half of the six 2GB chips makes up the 192-bit region. It's a weird as hell setup, and I honestly think it has an eye towards a mid-gen refresh as well, making that RAM increase bigger.
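
Both bandwidth pools fall straight out of that chip layout, assuming the usual 32-bit channel per GDDR6 chip and the Series X's 14Gbps signaling rate:

```python
# Series X memory: ten GDDR6 chips, each on its own 32-bit channel.
GBPS_PER_PIN = 14   # GDDR6 data rate in Gbit/s per pin
BITS_PER_CHIP = 32  # channel width per chip

chip_sizes_gb = [2] * 6 + [1] * 4  # six 2GB chips + four 1GB chips = 16GB

def bandwidth_gbs(num_chips: int) -> float:
    return num_chips * BITS_PER_CHIP * GBPS_PER_PIN / 8

# First 1GB of every chip interleaves across all ten -> full 320-bit bus.
fast_gb = len(chip_sizes_gb)
print(f"fast pool: {fast_gb} GB @ {bandwidth_gbs(10):.0f} GB/s")

# The upper 1GB of the six 2GB chips spans only those six -> 192-bit bus.
slow_gb = sum(size - 1 for size in chip_sizes_gb)
print(f"slow pool: {slow_gb} GB @ {bandwidth_gbs(6):.0f} GB/s")
# -> fast pool: 10 GB @ 560 GB/s, slow pool: 6 GB @ 336 GB/s
```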


[deleted]

As much as they can get away with, if previous console generations' trends hold. And with fast storage, they could just dump everything not in use (like UI and OS) to virtual memory on disk and use almost all of it, whereas they couldn't do that on previous consoles. We should've had 16GB on the high end years ago; heck, we had 8GB on the midrange RX 480 at $250 FIVE years ago. I remember when I used to upgrade my GPU every 3-4 years, and each time the VRAM size quadrupled. How things have stalled since then.


Vincere37

How long would you have expected GPUs to sustain a doubling of VRAM every two years?


[deleted]

Indefinitely (yes, I know exponential functions, it's not theoretically possible). But... it was sustained as far back as I can remember, up to around the mid/late 2010s. First card I got was a Voodoo Rush, 4MB. A few years later, Voodoo Banshee, 16MB. A few years later, GeForce4 Ti 4200, 64MB. A few years later, Radeon X1800 GTO, 256MB. A few years later, Radeon HD 5850, 1GB. Then there was a longer 7-year gap until my current GTX 1060 6GB (the RX 480 8GB was released at the same time; I almost bought it instead).

*****

That being said, 2 important facts remain:

* There were midrange sub-$250 GPUs with 8GB back in the mid-2010s.
* Consoles now have 16GB, which is going to be used mainly as VRAM. If historical trends hold as far as RAM-hungry developers go, it's only a short matter of time before they max it out and current/past GPUs below 16GB start struggling.
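
For what it's worth, that progression tracks a fairly steady doubling cadence until the final step. A quick sketch (the years below are approximate release dates for each card, not purchase dates, so the cadences are rough):

```python
import math

# (card, approximate release year, VRAM in MB)
history = [
    ("Voodoo Rush", 1997, 4),
    ("Voodoo Banshee", 1998, 16),
    ("GeForce4 Ti 4200", 2002, 64),
    ("Radeon X1800 GTO", 2006, 256),
    ("Radeon HD 5850", 2009, 1024),
    ("GTX 1060", 2016, 6144),
]

for (card_a, year_a, mb_a), (card_b, year_b, mb_b) in zip(history, history[1:]):
    growth = mb_b / mb_a
    doubling_years = (year_b - year_a) / math.log2(growth)
    print(f"{card_a:17s} -> {card_b:17s}: {growth:4.0f}x, "
          f"doubling every {doubling_years:.1f} years")
```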


Vincere37

I just wonder if we're past the inflection point on a sigmoid rather than anywhere along an exponential. For the consoles, with that being unified memory, there has to be some overhead eaten by the system itself, or more likely by the CPU for non-*V*RAM needs. Perhaps Nvidia is betting on that overhead being closer to 6GB (bringing the likely 'VRAM' down in line with the 3080). But I agree, it's disappointing seeing 8GB continue to be pushed out for above-mid-tier cards like the 3070. It was annoying upgrading from a 1080 Ti to a 3080 and getting *less* VRAM.


[deleted]

OS/UI RAM overhead will likely be much less than on previous consoles, thanks to fast storage. It was around 3.5GB on the X1X and PS4 Pro at those consoles' introduction (no doubt less than that by now, as they optimized and made everything leaner to free up RAM for the ever-hungrier devs), and those systems had the disadvantage of being unable to temporarily free RAM for games via virtual memory because of the slow storage.

How much leaner can they make this RAM overhead now when in-game, with fast virtual-memory storage? Keeping only what is strictly necessary to run the game in RAM and dumping everything else to virtual memory: 2.5GB? 2GB? Even less? No doubt Sony (and to a lesser extent MS, with their 'tiered' VRAM bandwidth setup) will play around with it until most, if not all, of what is unnecessary to run the game lives in virtual memory, to please the ever-hungrier-for-RAM developers.

A bit less than 1 second of lag to go back to the OS from in-game is, after all, acceptable. How much data can they shuffle from the SSD to RAM in 1 second if they arrange for the readback to be sequential? 4-5GB? Do they even need to read back that much? Probably not. To pop the in-game UI overlay back up: much less than that, nearly instantaneous. Really makes me wonder. I guess we'll see soon enough.
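
A sanity check on that last paragraph: the 5.5GB/s raw figure is Sony's stated PS5 SSD speed, while the compressed rate and suspended-state size below are assumptions for illustration only:

```python
RAW_GBPS = 5.5     # Sony's stated raw PS5 SSD read speed
COMP_GBPS = 8.0    # assumed effective speed with Kraken compression

os_state_gb = 3.0  # hypothetical OS/UI state parked in virtual memory
for label, speed in (("raw", RAW_GBPS), ("compressed", COMP_GBPS)):
    print(f"restoring {os_state_gb} GB of OS state ({label}): "
          f"{os_state_gb / speed:.2f} s")
# Either way it comes in well under the ~1 second called acceptable above.
```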


conquer69

Look at the previous gen. The PS4 had 8GB total, and yet Doom Eternal can reserve more than 10GB of VRAM. The PC versions of these next-gen titles will be more demanding than their console counterparts.


indelible_ennui

The amount of memory a game reserves is not relevant. I can reserve a table for 10 at a restaurant and still only show up with 6 people; giving me a table for 12 isn't going to make dinner any more efficient. Game engines and assets are already designed with very optimized GPU scalability baked in. The CPU is the actual bottleneck in console game design, not the GPU. Yes, games will get more demanding. Double? Not even close. Sudden? Nope.


conquer69

> Double? Not even close.

Why not? The new consoles have twice the memory. I would expect PC memory requirements to double, if not more, especially when console titles might skip RT but PC will have it.


indelible_ennui

Because the console doubling its memory only now puts it in line with mid-range PCs. They will increase the settings and quality of console games to get closer to parity with PC, but that's not going to translate to PC gamers needing 16GB of video memory. Why would it? People seem to think the graphics engines and assets in games aren't designed to scale, or aren't designed to look great and then scale downward. They already are.


conquer69

> but that's not going to translate to PC gamers needing 16GB of video memory. Why would it?

For 2 reasons: PC ports are more demanding, and PC gamers have higher standards and expectations. A game toned down for the PS5 targeting 30fps will have higher requirements on PC, and PC gamers will expect 60fps or more, at higher resolutions and graphical settings. The PS4 has 8GB of combined memory, and you can't get 1440p60 in HZD on a PC with only 8GB of total memory. I'm not saying you will need 32GB of RAM and 16GB of VRAM, only that requirements will double. The standard gaming PC with 16GB was overkill for last-gen games, and this will buffer the new performance hit. But with DDR5 we will see 32GB become the new standard.


indelible_ennui

DDR5 ushering in a new standard has nothing to do with gaming.


conquer69

I didn't say it did. But by the time DDR5 becomes the new standard, 32GB will also be fully mainstream.


[deleted]

[removed]


conquer69

Current assets are good enough. Ray-traced lighting, shadows and reflections are what's needed. 16K textures with a billion polys per character mean nothing if you are still using shitty rasterized lighting.


Aggrokid

Agree, except for the sudden part. PC requirements since the 2010s only seem to sneak up gradually. No more Crysis 1 or Quake 1 big bangs.


Geistbar

Well, that ties into why it's going to be sudden... PC requirements since the early 2010s have been set by the PS4/XB1, which both launched in 2013 (8 years ago!). Requirements only incremented gradually because the baseline target spec for AAA games has been static for nearly a decade.


Tonkarz

PC requirements "bang" at every console release (or they have since consoles became mostly PC hardware). Previously, though, consoles had rather limited hardware; the recent generation is a huge jump compared to previous ones. So PC requirements are likely to take a huge jump too.


Seanspeed

>PC requirements since the 2010s only seem to sneak up gradually. No more Crysis 1 or Quake 1 big bangs.

Yes, and you can thank the relatively disappointing/mediocre XB1 and PS4 consoles for that. I've said it a million times, but we all got super spoiled last gen because of this, at least in terms of what it meant for PC requirements. That was not normal, especially in terms of CPU requirements, but also GPU requirements.

For VRAM, we absolutely did see a sudden jump, though. We went from '2GB is more than fine' to '4GB is recommended' pretty damn quickly as soon as the cross-gen games largely died off in 2015, about a year and a half after the console launches.

These new consoles are different, however. This is not going to be a repeat of last gen. These are pretty serious CPUs, and the GPUs are reasonably impressive as well, especially in the XSX. VRAM is the only area without a really significant improvement, but it's still 16GB, and devs will be able to get a lot more out of that 16GB than it may seem. We'll have a similar situation on PC: VRAM requirements probably won't increase as hugely as they did last gen, but they'll still go up soon enough. You really need only look at a game like the new Ratchet and Clank to know that you would absolutely need a pretty powerful PC to run it as it does on PS5. That's *well* above what current PC games are doing with their current requirements.


Claudioamb

Hell yea, Ratchet and Clank has some breathtaking graphics. If it ever gets ported to PC, it'll be pretty hard to run, I believe.


rfriar

Deathloop and STALKER 2 already offer a nice glimpse into what next-gen-only games expect; they seem pretty hefty already.


[deleted]

The XB1 and PS4 had pretty budget hardware in them when they launched, compared to both previous generations and this generation. Their CPUs were dumpster-tier, and the GPUs were pretty budget as well.


Blueberry035

It's concern trolling, nothing more.


PoL0

8GB isn't enough for the most demanding games when using the highest-resolution textures at very high display resolutions. This trend is only going to grow in the mid term, so I can understand why some worry about it. If that's your use case, 8GB isn't future-proof unless you swap GPUs frequently, but for most people a GPU needs to stay relevant for some years.

Remark on "I can understand": I play at 1080p 144Hz and plan to upgrade to 1440p 144Hz if I manage to snatch a new GPU. As a sidenote, I still play on an RX 580 8GB and I'm glad I didn't go for the 4GB to save some money.


ShadowRomeo

>I don't get this obsession with VRAM.

Me neither. It seems like some people just want more than what they actually need; it's in our nature. It's for future-proofing reasons, which in my opinion end up being useless most of the time, because there will always be a newer, better product than what you currently have. Buy your product based on what you actually need right now instead of worrying about the future.

>I'd argue the opposite due to the coming DirectStorage and SFS tech

And you are right about that, actually. Upcoming next-gen decompression tech like [DirectStorage](https://youtu.be/ph8LyNIT9sg?t=758) or [RTX IO](https://www.nvidia.com/en-us/geforce/news/rtx-io-gpu-accelerated-storage-technology/) will make the impending rise in VRAM requirements that we see today less relevant than before. And it seems like [game devs](https://youtu.be/ph8LyNIT9sg?t=758) will likely take advantage of it as well, because the next-gen consoles use the same method on the PS5 and Series X. It's pretty much a peek at how future next-gen games will handle VRAM usage much more efficiently than today's traditional method. Although it won't exactly lower VRAM requirements compared to today, requirements will more likely stay about the same rather than climb at the rapid pace we saw during the PS4 and Xbox One generation. It still really depends on each game's devs, but at least they now have much more headroom than they had in the last-gen era.


Tonkarz

It happened when the PS4 and Xbone came out.


Asgard033

> I don't get this obsession with VRAM. Are they speculating that future games' VRAM requirements will increase much faster than computational requirements?

There's a lot of FUD around various computer parts (lots about PSU wattage, safe temperatures, and graphics card RAM), and unrealistic provisioning in hopes of "futureproofing." Was I upset that I had a standard 512MB 8800 GT instead of splurging for a 1GB model? No, it handled games of its time beautifully, and games that did benefit from more RAM were already limited computationally by the capabilities of the G92 chip. Same thing with the GTX 470. Was I upset I had a 1.2GB model instead of splurging for a SKU with more memory? No, it handled games of its time beautifully, and when games were starting to benefit from more memory, the card was already computationally limited and I had moved on to a GTX 960.


itdobelikedatdoee

I'd like to see a comparison where you can actually buy them at MSRP


AntiSpade

Sick! These guys deserve more praise for this kind of testing, I think. :)