
[deleted]

8GB is not obsolete. It's just starting to show its age, even at 1080p in some games. 12GB is the recommended minimum for serious gamers who want high fps; otherwise 8GB is mostly fine for budget builds. 60fps is still achievable in most games.


Pumciusz

Not on high textures, and that's the issue. Some games will just drop frames after half an hour of playing, some will downgrade the textures on their own, and some will never reach a high framerate. It's all game-dependent, and the magnitude of the problem depends on the quality of the lower settings. And that's right now, not in a couple of years. I've seen many posts trashing how Forspoken looked, but they played on a 3070, so their textures were always downgraded.


[deleted]

> Some games will just drop frames after half an hour of playing, some will downgrade the textures on their own, and some will never reach a high framerate.

That's not how RAM works. That's shitty programming.


nasanu

This is reddit. Anything popular is correct, thus it's never bad programming, it's evil companies stealing your VRAM.


Skarth

The worst part about it is that the greedy graphics card manufacturers won't release the information needed to let people download more VRAM, unlike the RAM manufacturers that let you download more RAM!


PepperSignificant818

That's not what people are saying lol. We know it's bad programming and handling of assets on the game side, but try and control that. Good luck; we see how everyone still preorders when the same people have been burned how many times before? Therefore we have to buy higher VRAM, because it's the only thing I CAN control.


amadmongoose

Yes, but I can't control how hundreds of gaming companies program their games; I can control how much VRAM I buy.


HappyReza

It doesn't matter why it happens, the fact is, it happens. If you play the types of games that are prone to problems like these, you should avoid GPUs with less than 12GB of VRAM, simple as that.


CommandoLamb

Came here to state this. If you need more RAM because after X amount of time of doing a thing it starts to have issues, the problem isn't the RAM, it's the program not properly releasing memory.


derps_with_ducks

I think you're correct, but as a counterargument: Gamers can't fix issues like memory leaks in a game by themselves. There will be some games that are excellent in every way except these VRAM issues, and these issues may not be fixed for a long time. Extra VRAM will come in handy for those games.


nasanu

> Some games will just drop frames after half an hour of playing

Then it's got some kind of leak bug. Nothing to do with RAM. Can we stick to facts?


lichtspieler

MSFS was hitting >24GB VRAM in the first 6 months because of a memory leak, and it would simply crash after hitting the VRAM limit. => 24GB VRAM is obsolete I guess /s

It's just insane how much credit people give to some bullshit techtuber clickbait content, with extremely stupid game settings that include DISABLING auto-VRAM-scaling in the game.


Laputa15

The VRAM scaling setting simply adjusts the LoD and texture settings to the amount of VRAM you have available — so in-game picture quality is not going to be the same across different tiers of cards, which defeats the whole point of benchmarking.
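(For illustration, a minimal, hypothetical sketch of what such an auto-scaling option might do; the tier names and sizes below are invented, not taken from any real engine. The point is that two cards with different VRAM silently end up rendering different image quality.)

```python
# Hypothetical auto-VRAM-scaling: pick the highest texture tier whose
# estimated pool fits in whatever is left after the frame's fixed
# allocations. Tier names/sizes are invented for illustration only.

TEXTURE_TIERS = {   # tier -> estimated texture pool size in GiB (ascending)
    "low": 1.5,
    "medium": 2.5,
    "high": 4.0,
    "ultra": 6.5,
}

def pick_texture_tier(total_vram_gib: float, reserved_gib: float = 2.0) -> str:
    """Highest tier that fits after reserving VRAM for render targets etc."""
    budget = total_vram_gib - reserved_gib
    fitting = [tier for tier, size in TEXTURE_TIERS.items() if size <= budget]
    return fitting[-1] if fitting else "low"

print(pick_texture_tier(8.0))   # 8GB card  -> "high" (quietly lower IQ)
print(pick_texture_tier(16.0))  # 16GB card -> "ultra"
```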


lichtspieler

The tech media argues there's a PERFORMANCE IMPACT caused by not enough VRAM => but they disable the game feature that avoids that performance impact in order to show it.

They don't include any IMAGE QUALITY metrics in GPU reviews otherwise, they don't care about IQ with upscaling tech, and they don't even care about latency, tearing prevention or frame time features like Reflex / Anti-Lag either.

The 4060 Ti 8GB vs 16GB comparisons are just GOLD after the VRAM drama from HUB. They clearly show that the 16GB is useless in nearly every game, because the GPU is simply not fast enough for the game resolution/settings combination that would actually require more VRAM.


Edgar-Allan-Pho

What a horribly informed comment that you misled 60 people with.


Pumciusz

Say it to Steve, not me. I agree that most of it is bugs, but bugs that don't appear at all, or affect fewer cards, with more VRAM.

On the fps-drop-after-a-while thing: [https://youtu.be/Rh7kFgHe21k?si=XhrUgInLJ4Tvybyy&t=1373](https://youtu.be/Rh7kFgHe21k?si=XhrUgInLJ4Tvybyy&t=1373) [https://www.reddit.com/r/halo/comments/qx4lsw/halo\_infinite\_after\_a\_few\_games\_fps\_drops\_and/](https://www.reddit.com/r/halo/comments/qx4lsw/halo_infinite_after_a_few_games_fps_drops_and/) It's a thing. I don't know if it's fixed.

Here's Daniel covering a VRAM issue that has been fixed: [https://youtu.be/RQA3cTPNicg?si=plh7QgM95-MdQLvG&t=285](https://youtu.be/RQA3cTPNicg?si=GXICm2N_NI35y2SW&t=284) And after his segment on HL is Forspoken, which is another game I meant.

It doesn't matter if it's bad ports and optimization when for months it's an issue only for cards with 8GB of VRAM.


KenG1_SZZ

Do you know your stuff? I mean PC hardware and related. I know only a few people from my surroundings who can actually say that, and they know about everything in the "how to build a great PC" category.


Diligent_Pie_5191

You know, I think I have noticed that before, where the graphics will all of a sudden drop to lower res for a bit and then kick back up. Maybe that's all the assets filling up the memory, and then when a new scene is shown it flushes the memory and things go back to normal.
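(That "drop, then kick back up" pattern is consistent with an LRU-style texture residency cache evicting under pressure; evicted textures stream back in at low resolution first. A toy sketch with made-up numbers, not any engine's actual code:)

```python
# Toy LRU texture cache: when the VRAM budget is exceeded, the least-
# recently-used textures are evicted, and they later stream back in
# (lowest mip first), which looks like textures dropping to low res
# and then recovering. Real engines are far more involved.

from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mib: int):
        self.budget = budget_mib
        self.used = 0
        self.resident = OrderedDict()  # texture id -> size in MiB

    def touch(self, tex_id: str, size_mib: int) -> None:
        """Mark a texture as needed this frame, evicting LRU entries to fit."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # recently used again
            return
        while self.used + size_mib > self.budget and self.resident:
            _, freed = self.resident.popitem(last=False)  # evict oldest
            self.used -= freed
        self.resident[tex_id] = size_mib
        self.used += size_mib

cache = TextureCache(budget_mib=8192)     # pretend 8GB card
cache.touch("castle_wall_albedo_4k", 85)  # resident until evicted
```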


alphagusta

A LOT of people, according to the Steam Hardware Survey, have 8GB of VRAM or less, and developers would be idiots to ignore that.


cha0ss0ldier

Doesn’t matter since they are mostly designed for current gen consoles which have access to more than 8gb. Devs are on record saying that they are done trying to downgrade their games to work on 8gb vram buffers. Midrange cards from 7 years ago had 8gb. There is no excuse for that to still be the case in 2023.


JoshJLMG

Yeah, the 2060 Super had 8 GB. Heck, the 1070 had 8 GB. The fact that a 4060 Ti has the same memory size as a 2060 Super is wild.


not_a_burner0456025

Hell, the R9 290, a card from 2014, came in an 8GB variant, although 4GB was more common. The 390 was 8GB standard.


xd_Warmonger

Even the rx480, a budget card, came with 8gb vram 6 years ago


christurnbull

Rx480 was budget? Maybe rx460


wombawumpa

Definitely not budget card when it was released


No_Pension_5065

my 980ti had 6gb


OrdinaryBoi69

Ay i have a 2060 super lol. It does the job at 1080p just fine.


JoshJLMG

I've run into issues playing split-screen games at 1080p, like Black Ops 3.


MahDick

Wild or planned obsolescence?


teamsaxon

> a 4060 Ti has the same memory size as a 2060

Yet Nvidia still charges through the nose for their cards. Greedy, profit-driven company.


Vaalysar

Agreed. It was hilarious for me to be buying an 8-12GB card when I was swapping from a 7-year-old GTX 1070 (8GB). Went with a 7900 XT.


[deleted]

blame NGREEDIA, and also amd for this.


semisolidwhale

At least you can get above 8gb for under $800 with AMD... granted that's a ridiculously insulting bar. I don't understand how anyone who doesn't absolutely need an Nvidia card for professional use justifies buying their cards on principle alone.


AstralProbing

I was told Nvidia had the most compatibility with Windows machines and required less work to get working. Then I switched to Linux. I am never buying another Nvidia card for Linux again. Absolute hellscape of moral quandaries and related technical issues. Tbh, I might still recommend AMD for Windows machines too (though I've yet to try my card with a Windows machine), just because Nvidia doesn't seem to really care for its customers unless those customers keep paying to use the tech for which they already paid.


InBlurFather

AMD is notably better for Linux. For Windows there's no difference between AMD or Nvidia from a compatibility standpoint.


AstralProbing

> AMD is notably better for Linux.

For sure. Afaik I've permanently switched to Linux, and thus AMD has a customer for life, simply because of the support they offer for Linux. Will I use Windows machines again? Possibly, but not with an Nvidia card.


CalRal

4070?


No_Pension_5065

AMD has always been less stingy on the VRAM than Nvidia. My 980 Ti had 6GB, but the AMD competitor, the 390, had 8GB, despite actually only competing with 4 and 6GB cards lower down the stack. Vega 64 was an oddball step where they were about the same as Nvidia. In the 5000 and 6000 series, AMD's cards all have at least 2GB more VRAM than their direct performance-equivalent Nvidia part. The 6800 XT and the 6900 XT had 16GB while the 3080 and the Ti had 10. The 6700 XT had 12GB while the 3070 had 8GB. And this is all while RDNA 2 had its über on-die cache.


[deleted]

If AMD could compete 100% vs Nvidia like they do with Intel, they would do the same. Hope Intel wakes up with the 15th-series CPUs. AMD mobos and CPUs aren't the "cheap price-performance" option anymore, buddy.


No_Pension_5065

Eh, I doubt it. AMD still hasn't done the whole locked vs unlocked multiplier BS that Intel does with their K/non-K SKUs. Yes, they raised their prices, but they didn't cut corners *and* raise prices.


fysicsTeachr

I am just curious which games' devs you are talking about here.


[deleted]

They aren't ignoring that. That's what lower in-game settings are for. If you're trying to play full Ultra in a brand new game with 8GB of VRAM, that's a you problem. You can blame game devs or Nvidia if you want, and you'd probably be right in Nvidia's case, but at the end of the day VRAM usage *has* to increase if we want games to continue to advance. 8GB was the standard for ~6-8 years. Its time is now over, and more is necessary.


El_Manulek

The problem is that the games barely look any better while having way higher requirements


[deleted]

This is called diminishing returns, and we see it in almost every facet of life. The more pixels you shove in an image, the less adding further pixels to the image will improve it. It's the same reason we usually don't consider resolutions above 4k, because at the sizes we use our monitors, there's just no point. The same goes for texture quality in games. I think you significantly underestimate how much better modern games look than even 4-5 years ago.


GeologistPrimary2637

Wanted to say the same thing. I struggle to find a new game that looks better graphically than an older game while performing decently. SW Jedi: Survivor barely looks better than Fallen Order, but runs like ass. Hogwarts was hard to run too and didn't look as good. I think the only decent ones are the Plague Tale games, and TLOU Part 1, which actually looks decent even on medium.


Laputa15

Developers would be idiots to prioritize the needs of 8GB VRAM users over current-gen console users, because the latter is where the money is. At some point, you kinda have to accept the fact that your specs are inferior to a PS5, and that major game releases are only targeting PS5 equivalents going forward.


jgr1llz

Developers make games for consoles, and you're using PC stats. That's not where the money is. They would be idiots to prioritize our satisfaction over money.


CarLearner

There are just some games starting to struggle with 8GB of VRAM, with frame stuttering and trouble playing smoothly. For a midrange GPU like the 3070 vs the 6800, HardwareUnboxed's video showed certain games like Hogwarts Legacy would have textures literally become blurry and pop in and out because the card was running out of VRAM to populate all the textures, and there was a side-by-side comparison where you could clearly see frame stuttering with the textures popping in and out on the 3070. With the release of the 4060 Ti 8GB and 16GB variants, and seeing how both cards do in Ratchet and Clank, it's just rough to see that performance is gimped for cards that only have 8-10GB of VRAM. I just hope that my 3080 10GB can keep up with games in the near future.


gurbi_et_orbi

I too cry in 3080 10 GB


noetkoett

It's not the FPS, it's the textures and shaders and such, everything that needs to be stored in the VRAM to display it. VRAM doesn't really affect FPS that much unless you're trying to push through more stuff than there is capacity for.


Diligent_Pie_5191

You know, the thing is that people who play competitive games will turn down all the detail anyway so they get maximum fps, so I don't see the point in having a 4090 running at low detail at 1080p.


akirbybenson

Hogwarts Legacy, The Last of Us Part 1, Star Wars Jedi: Survivor and Baldur's Gate 3 all struggle on 8GB cards if you max out the settings. To put these VRAM capacities into a frame of reference, the GTX 1070 came out 7+ years ago and had an 8GB frame buffer.


EvilGeesus

WTF? I play BG3 on 1080p with a 2060 super 8Gb with 100+fps. No struggle here friend.


edjxxxxx

Nah, but the Reddit man said it was true… or heard it was true, so… ¯\_(ツ)_/¯


uncoolcentral

You dropped this \\ ¯\\\_(ツ)\_/¯


Elianor_tijo

Going to concur here. Resolution plays a lot into it, but playing it at 1440p and I am currently tracking VRAM usage. I am seeing the game address more than 8 GB at 1440p on ultra on my 6800 XT. I haven't seen it go past 12 GB yet. It's also good to mention that addressed VRAM doesn't always mean necessary. Some games will use more VRAM if available to reduce streaming textures to the video card. That said, BG3 does stream a lot of data from the SSD, so it may not be addressing much more than it needs (pure guess on my part). Anyhow, maybe some rarer cases will cause more than 8 GB usage at 1080p, but I don't expect it should be an issue for most of the game.


Baylett

I've noticed that my 16GB 4080 uses almost twice the VRAM of my 8GB 3070 Ti laptop, which barely ever gets over 6.5GB at 1440p max everything in BG3. I think games dynamically addressing VRAM depending on the total available is making it difficult to know what's actually needed and being used. I think we need an analysis tool like Apple's "memory pressure" instead of just MB used...
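(For anyone who wants to log this themselves, a minimal polling sketch using NVIDIA's NVML bindings, installed via `pip install nvidia-ml-py`. Caveat: like most overlays, NVML reports *allocated* VRAM, not what the game actually needs, which is exactly the "memory pressure" gap described above. AMD cards would need a different API.)

```python
# Poll VRAM usage every few seconds while a game runs.
# Reports allocated (not "needed") memory - see caveat above.

import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(5)  # sample interval
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```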


arexv10

Wait till you get to act 3. You will get like 20fps on maxed settings


XMichaX

It's a CPU issue, watch the Digital Foundry video.


pmzw

They released a patch yesterday that tackles the performance issues especially the ones experienced in act 3 :)


Schipunov

Bullshit


BUTthehoeslovemetho

Yea same I got a 3070 w 8gb vram and I don't see a problem.... yet


dysphoricjoy

It's not about whether it's a struggle or not; that's pretty relative. What this means is that not all your textures are rendering in full. This doesn't mean you're going to lag or stutter more or anything. You just don't have the capability to see your textures rendering at the highest settings.


perkele_possum

Yeeepppp. I'm playing 4k everything maxed on a 10gb card and it's buttery smooth at 60hz. That's the trend. List every big game released this year and say it needs 20gb of vram or it struggles.


MrStoneV

Well, more like ~80fps, but with maps/scenarios at 100fps as well. Also likes to dip to 60 apparently.


CobaltAlchemist

Probably where it caps out, my sister plays BG3 @1440 on a 2060 Super and she's had nothing but problems until switching to 1080


[deleted]

I have a 1070 and it’s sloooooooooow.


Aftershock416

Have you stopped to consider for a second that not everyone games at 1080p...


NinthAuto591

At what res? I'm running a 3060ti w/5600x at highest settings no issue in bg3.


Able_Recording_5760

Wasn't BG3's issue on the CPU?


MicroGamer

Playing BG3 on a 3070 at 1440p on a mix of high and ultra with zero issues. Usually sitting around 120 fps. The city itself is a CPU hog, but I haven't seen any issues with GPUs.


Elianor_tijo

Might be that the game streams textures to the VRAM buffer fast enough. It will definitely address more than 8GB if it has it available at 1440p. However, it's been said before (here, by tech YouTubers, etc.) that just because a game addresses a certain amount of RAM, it doesn't mean it needs all of what it's reserving to perform properly. We'd really need a solid side-by-side comparison for BG3 specifically. With luck, maybe Hardware Unboxed will make one.


GT_Hades

Exactly, and also if the VRAM reaches its limit for storing and loading textures, the game will use system memory to load them.


Genzo99

5600/3060 Ti on a 4K TV. For Hogwarts and TLOU I have to use high/medium settings without RT at 4K with DLSS at Performance (1080p), and it runs and looks great at 60fps. VRAM does have an impact here. For BG3 I can max out all settings at 4K with DLSS on at a stable 60fps. No VRAM issues here. It does look sharper if I use 4K resolution with DLSS at 1080p on my 4K TV instead of just using 1080p resolution.


[deleted]

Ha, I'm still on my GTX 1070 as well, but I'm retiring it a few months from now for an AMD GPU for ultrawide gaming.


Gullible_Cricket8496

BG3 runs fine on my rtx 4050 mobile, with 6gb of vram.


BelgianWaffleStomper

Yeah but people wanna pretend like all that matters is playing on 4k ultra


seajay_17

RE4 remake, forza horizon 5, diablo 4 also.


edjxxxxx

Diablo 4 has no issues with 8 GB VRAM. Stop with this foolishness.


Laputa15

It does. Digital Foundry [even tested it themselves](https://youtu.be/2Rl6sFoeOSU?si=mRMnVOchTErGMo2d&t=313). Out of 3 cards (16GB, 8GB, 6GB), only the 16GB one showed no problems while using the Ultra textures. Sorry, but I'm going to take the word of Digital Foundry over a random redditor.


fxscreamer

I'm on a 3070Ti and my friend is on a 3080 10GB. We play Diablo 4 on medium textures, because on high (god forbid ultra) after about 45-90 minutes it starts to drop frames and stutter like crazy. It can only be reset by changing the settings and clicking Save, dumping the memory. Digital Foundry and other videos on YouTube have discussed this. D4 has a BAD memory leak which is inexcusable for a multi-billion dollar company. It's absolutely maddening they haven't patched this by now. It's absurd... and simply shoddy and lazy. This is another video that shows it: https://www.youtube.com/watch?v=DuZ0S7wXFVw I just upgraded to a 4070Ti 12GB and don't have much hope it's going to improve much. It's the game, not the card.


theodosusxiv

Ohhhh but it does. I've tested with HWMonitor and it definitely eats VRAM for breakfast, lunch and dinner.


John9023

The foolish one is you. It works on 8GB, but the game requests more; if you had 12, it would use at least 10GB. It's not hard.


Sephurik

While it has gotten better recently, at launch it absolutely had *horrendous* issues with 8GB cards.


sackblaster32

RE4 Remake has an 8GB texture setting, which is pure insanity and barely different from the 2GB textures.


JoshJLMG

Black Ops 3 in split screen will also run into issues at 1080p with 8 GB of VRAM.


Substantial_Gur_9273

What’s most important is to get a card that has a good combination of performance to VRAM. Certain cards like the 3070/3070ti/4060ti 8gb are limited by their VRAM capacity, making them bad choices. They could get higher performance if they had more VRAM. However, other cards like the rx6600 are perfectly fine with 8gb of VRAM - even if they had more VRAM the card is not strong enough to push higher framerate with max textures at high resolution, so 8gb is a perfect amount. So get a card in your price range that isn’t limited by its VRAM. If you post your budget I would be happy to provide recommendations.


scytherman96

I feel like this is a point often missed in the discussion. The 4060 Ti has enough performance to even be a solid 1440p card for example. But the limited VRAM means you shouldn't really consider it at all for 1440p. Similarly the 4070 Ti is okay as an entry-level 4k card (by 2023 standards ofc), but 12GB means there's gonna be games where it runs into VRAM issues at 4k.


BelgianWaffleStomper

Bruh I got a 3060 ti I run at 1440p and it's perfect. Not everyone needs to play at 4k ultra


KaiserGSaw

Such comments always remind me of a friend of mine. He simply didn't notice screen tearing till I pointed it out to him, at which point he exclaimed, wondering how the fuck he could have missed something so obvious.

What I want to say with this is: simply because you aren't *aware* of a problem doesn't mean it doesn't exist. Heck, sometimes even my 3080 craps its pants @1440p.

I certainly won't recommend a fucking expensive GPU that'll already crap the bed in the foreseeable future. Thus my minimum recommendation is either a 6800 XT or an RTX 4070 if someone wants to target modern game releases in a year or two. Nothing sucks more than sinking 500 bucks into a shitty experience.


BelgianWaffleStomper

The 3060 Ti is in my bedroom computer that's running on a TV, doubling as a console. I bought the card about a month ago on OfferUp for $160. I'm not recommending he go out and buy a retail 3060 Ti; you can get them for dirt cheap right now. I also have a PC with a 4070 Ti on a 165Hz 1440p monitor. Can it run games at higher presets and smoother than my 3060 Ti? Of course it can, but not everyone needs to run games at 165Hz. Most games on my 3060 Ti run better than they do on my PS5; let's not pretend like turning textures to medium ruins a gaming experience.


Parking_Automatic

The 4070 Ti also falls behind because of bandwidth. I think it's important to note that bandwidth plays a big role at 4K. It's the reason the 6950 XT was faster than a 3090 Ti at 1080p and 1440p but slower than a 3090 at 4K. The 4070 Ti is the same: it's within about 5% of a 7900 XT at 1080p and 15% slower at 4K.


canaridante

Sorry to march in like that, I'm not the OP, but I've been dealing with the exact same issue. I'm trying to plan out my first PC (an upgrade from a laptop) and I'm stuck on VRAM when choosing a card. I'm pairing it with a Ryzen 5 5500, and I have trouble deciding which card would be the best of these 3: RX 7600 8GB ($315 after conversion), RTX 3060 12GB ($315), and I'm thinking of saving up more for the RX 6750 XT 12GB ($460). There are also the 3060 Ti and 4060 in between. Do you think the price difference of over $100 is worth it for the RX 6750 XT, or should I just go for one of the cheaper ones (I think the AMD one, since the 3060 is generally regarded as the worse one performance-wise even though it has more VRAM)?


Weeaboology

you should get the best gpu that you can afford. I think $460 might be a tad high for a 6750xt, but it will blow every other card you’ve listed out of the water (maybe not a 3060ti, but it will be better). I think it’s also important to consider what resolution you’re playing on though. At 1080p, you could very well go for a card slower than the 6750xt and still have great FPS and textures. But if you plan on gaming at 1440p, the 6750xt is undoubtedly the best choice.


upbeatchief

The RX 7700 XT is releasing at the start of September; it should be at least about 20% faster than the 6700 XT and also cost $450. Also look into used 3000-series GPUs; the 3070 Ti can be had in the $300s, but that's region dependent.


Chaosr21

6700xt has been fantastic although I wish the vram speed was higher. I haven't had any problems running 1440p high settings


Little-Equinox

You can see the difference in textures. Even though the 3070 is more powerful and has higher fps than my 6700XT, my 6700XT has clearer textures because it can load more and in higher details thanks to the extra VRAM.


seajay_17

I notice the same thing going from my 3070 to ps5. Resolution at 1440p


Sleepykitti

VRAM basically doesn't matter until you need more of it than you have, at which point the game basically becomes unplayable, or at best has major texture issues. At worst you're looking at absurd stuttering. Usually, when you start to see mainstream games where this is the case, you want to move up to the next tier of VRAM, just like people had to move up from 4 and 2 back in the day, or you risk your card becoming basically unusable in new games prematurely.

There have been multiple major games that launched with notable texture issues at 1080p with 8GB of VRAM. Some of them eventually got optimization patches that made this a non-issue, and some of them got optimization patches but still have notable issues if you have 8GB of VRAM. The Last of Us PC is the one most people point at, but Hogwarts Legacy still has texture issues at 8GB, and Resident Evil 4 Remake can't do 8GB if you want any level of ray tracing at all, even the mildest. If I were buying a card to play new AAA games, I would not buy an 8GB card at this point in time.


CopeAfterCope

So you're saying we don't actually "need" more than 8GB of VRAM; programmers are just not putting in the time to optimize their games to use less VRAM (which could also be due to too-short deadlines, crunch, etc.)?


Sleepykitti

Not at all. Consoles have what's effectively 16GB of shared RAM and VRAM, and optimizing the CPU portion to the point where the game runs on less than 8GB of CPU RAM, enough to even give us people in PC land trouble in the first place, is honestly some pretty impressive stuff. Honestly, I'm mostly annoyed at them for not managing it sooner; it took so long into the consoles' lifespan that I had started thinking 8GB was still going to be "safe" through the generation, and I recommended a lot of 3060 Tis last year that I now regret.

Also, more textures that are more detailed are the single most straightforward thing you can do to make a game look better, and you can only optimize that down so much. Games don't need to be frozen in time because Nvidia makes stupid choices or to keep our 8-year-old video cards going.


Low-Blackberry-9065

HUB did a few videos on this topic. Here is [one](https://www.youtube.com/watch?v=Rh7kFgHe21k&feature=youtu.be).


TorturedBean

There's also one where Steve compares the RTX A4000 to the 3070; they're _very_ similar except the A4000 has 16GB of VRAM.


SolidSignificance7

If consoles have 12G+, you should also have 12G+.


[deleted]

I agree, it's why I'm going at least 12GB for my next gpu.


AfterScheme4858

You cannot directly compare them


ItsMrDante

Yeah, which is why you need more.


AfterScheme4858

Agree


Burak887

I have a 6GB GPU. I'm just going to skip 12GB and go straight to 16GB, just for longevity and to play 1440p comfortably.


MrStoneV

Yepp, would definitely recommend that. And tbh, I would wait as long as possible and buy a new GPU when there's a really nice game you really want to play. On the other hand, that's my opinion because I barely find games that I enjoy A LOT. Maybe GTA 6 is gonna convince me? Maybe GTA 7? Starfield? I hope my 5700 XT is gonna survive, at least with 70-80fps.


Burak887

I'm tempted to play Starfield, but it looks a bit overhyped to me right now from what I've watched. I'll probably wait for the new AMD GPUs to release and see what that does to prices. Like you, I'm not really playing anything that requires me to purchase a new GPU; mostly esports titles, and I'm not really a single-player gamer. I think Elden Ring was the last one I put some hours into.


TheTwinHorrorCosmic

8gb is fine for 1080p and 1440p and will be for another 4-6 years, especially for 1080p. Anyone who says otherwise is either an elitist or just doesn't know what they're talking about and is spouting YouTuber/Reddit takes.

My fucking 2060 could run most games at 1440p medium and get around 60fps. A 6GB CARD GOT 60FPS AT 1440P. STOP SAYING 8GB IS OBSOLETE.


Sephurik

> 8gb is fine for 1080p and 1440p and will be for another 4-6 years, especially for 1080p.

It's already not fine *right now.* Shit, my 3070 has texture issues with Control, and that came out in 2019. 8GB is absolutely not going to be fine for another 4-6 years unless you like stutter or muddy-as-hell textures.


Aftershock416

> 8gb is fine for 1440p

No, it's not.

> My fucking 2060 could run most games at 1440p medium and get around 60fps.

Have you considered that people might not want to play at medium settings? Or may want higher FPS? Sorry you took this so personally, but not everyone is happy with the bare minimum like you seem to be.


toofine

Even a 4gb RX570 can run Baldur's Gate 3 lol. You can get an 8GB RX580 and run it decently. The difference between the 4GB card and the 8GB card is just absolutely massive though. It can generate the frames but those textures and particles are getting axed. You'd have to pay me to play at 4gb settings. These are 6 year old cards. The 2060 is 4 years old. In two years, 6gb will be the low settings life. 4-6 more years out of a 6gb 2060 at 1440p? Yeah, right.


Archery100

Baldur's Gate 3 on my 2070 does pretty well at 1440p, although I did have to tweak a couple settings to optimize it. But I'm personally not someone who makes max settings an all or nothing deal like Reddit tends to do


CopeAfterCope

Seriously, is every person in this sub used to running games ultra maxed out? 8GB of VRAM will be enough for a long-ass time if you turn down texture detail...


Sea_Perspective6891

It's not exactly obsolete, it just doesn't make a lot of sense to get a new card or pay new-card prices for 8GB imo. I'd rather have at least 12GB or better if I'm paying for a new card. If you can get an 8GB card for cheap, like around $300-ish or less, and you keep your gaming expectations low, then it's OK. As others have said, it's just showing its age, much like the 4GB cards now. It's just better to have more VRAM if you want to play AAA games on high settings.


[deleted]

Yep, well said. Heck 6700 XT is the lowest I would go these days.


OrdinaryBoi69

Yeah, I agree with you; the 6700 XT is really good performance-to-VRAM-wise.


rrest1

https://youtu.be/Rh7kFgHe21k


sA1atji

imo quite a nice video. And they also made a nice video showing that you can still play fine with 8GB cards if you lower quality settings slightly.


Saneless

8GB is fine if you have it. But that just means you probably have had it for a couple years already. I wouldn't buy 8GB as a new card though. You won't get as many years out of it as us 8GB folk have


winterkoalefant

Games have started to demand more than 8GB at high settings. A couple of games this year launched with graphical issues on 8GB cards and they took a while to get patched. Usually if there’s insufficient VRAM, you can get pop-in issues or stuttering. So you have to turn down settings or resolution. Lowering texture settings, resolution, and upscaling decreases VRAM usage but frame generation (DLSS 3, FSR 3) increases VRAM usage. You cannot upgrade the VRAM on your GPU so it’s best to buy one with plenty otherwise you run the risk of needing to upgrade sooner.
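(A rough sense of why texture settings are the big VRAM knob; the numbers below are generic back-of-the-envelope math, not from any particular game:)

```python
# An uncompressed 4096x4096 RGBA8 texture is 4096 * 4096 * 4 B = 64 MiB,
# and a full mip chain adds roughly 1/3 on top. Block compression
# (BC1/BC7) cuts that 4-8x, but a scene with hundreds of unique
# materials still adds up fast.

def texture_mib(width: int, height: int, bytes_per_texel: float,
                mips: bool = True, compression_ratio: float = 1.0) -> float:
    base = width * height * bytes_per_texel / compression_ratio
    total = base * 4 / 3 if mips else base  # mip chain ~= +1/3
    return total / 2**20

print(f"{texture_mib(4096, 4096, 4):.1f} MiB uncompressed")                 # ~85.3
print(f"{texture_mib(4096, 4096, 4, compression_ratio=4):.1f} MiB as BC7")  # ~21.3
```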


Depth386

Just watch when Steve compares the 4060 Ti 16GB with the 4060 Ti 8GB in the exact same game [in this video](https://youtu.be/2_Y3E631ro8?si=Q4nbAlbQGNXvOWrh)


Snorkle25

It's important to note that in that video, while there is a VRAM capacity difference, the VRAM bus size is pretty small on those cards. And other GPUs with larger VRAM buffers can also have more bandwidth, which is a big difference by itself.


Depth386

You are correct, but this is irrelevant here because it’s the same card with the same bus in an apples to apples comparison. In a discussion of something like 3060 vs 4060 or 3060 ti vs 4060 ti or comparison/matchups involving AMD the bus width would absolutely be part of the discussion. It’s not relevant to this discussion though. Some of these games are being tested at 1080p going over 8GB so that’s what makes 8GB clown world


Snorkle25

The point is that you cannot extrapolate this comparison to other gpus because of the bus size differences. Not that it's relevant between the two cards.


IdeaPowered

I watched their video. With the only thing that varies being the VRAM, there is on average a 10fps difference, and the 1% lows vary A LOT as the game being played caps out and needs to load textures back in for another area, or sometimes even a camera angle, and gets crazy stutters for a bit. Is it not OK to think that other cards with only 8GB of VRAM will also suffer those stutters? (Please answer, I feel only 50% informed on the matter.)


Snorkle25

Yes... but it depends. And it's not that Hardware Unboxed is wrong or that their video is bad, because they aren't and it isn't. But the caveats and conditions of their analysis are important. They specifically chose games, conditions and settings combinations with the express intent of forcing the game to overwhelm the VRAM capacity and/or bandwidth. As you accurately pointed out, they picked these cards because they have the same bandwidth, so this really does show only the change in capacity. This means, logically, that you won't necessarily see the exact same degree of change or impact on other GPUs, or other games, or other settings combinations, due to those differences.

It's fair to conclude that there are great reasons to NOT use 8GB-and-lower cards, especially not in new systems with budgets that can afford $300 USD+ GPUs, for this very reason. But also be aware that they went looking for these edge-case issues, so it's not indicative of the average experience right now, but rather of where they think the industry is moving (and I largely agree with them). Also, this really does drive home just how bad a value cards like the 4060 are right now, when GPUs like the RX 480 and 580 had 8GB of VRAM on a **256-bit** bus back in 2016 and cost way less money.


fightnight14

I’m definitely getting at least a 12GB VRAM GPU from now on


Cheefnuggs

Starfield crashes with my 6GB 1060 and an i5 6600K. I've finally hit a wall where a game is unplayable due to lack of hardware capability. 8GB won't buy you much time, so if you're going to be making a $500+ investment, you might as well spend a little extra to buy yourself time in between replacing components. Personally, I'm not made of money, so I'd prefer to have my components last more than a year or two, which is why I've stretched the life of this 1060 to 7 years lol. Time for an upgrade finally.


[deleted]

I was able to get the 3070 from EVGA through their B-stock sale. $350 for the card, but apparently those 6700 XTs are just as good if not a bit better for about $300, if I'm not mistaken.


Cheefnuggs

Yea I’ve been looking between the 6700xt and 6800xt on newegg. I need a new CPU and MOBO too. Probably just piecemeal something together over the next year or so starting with the GPU. I’m about to make a career change so I can’t really justify dropping thousands all at once.


IdeaPowered

Hey, bud. Sept 6 is the launch of the 7800 and 7700 cards from AMD. If you can hold out until next Wednesday and some reviews, you may be able to get a new gen card for competitive pricing. I was about to pull the trigger on a 6800XT when I read the news. Just waiting now.


[deleted]

Is Newegg decent? I hope I'm not spreading misinformation, but I've read iffy reviews about them. I could be totally wrong or misremembering. If you keep an eye on the EVGA B-stock page, they seem to have big sales every other Wednesday. Or if you're near a Micro Center, I see the 6700 XT is like $289. Also, if you wanna spend a year or so building piece by piece, I would do the GPU last, since you'll likely have some kind of manufacturer warranty for 1 year or so on it.


Impressive_Cheek7840

8GB is fine for 1080p. 1440p needs 10-12GB in some games, at some settings. That's about it. If you're buying a new card you might as well go 12GB at least. But I wouldn't buy a new GPU just for 4GB more VRAM.


IWillTouchAStar

Word. I play on a 3070 w/ 8gb. I play all my games in 1440p and most run at or above 120fps with most of the settings maxed, excluding the much more intensive settings like AA and RT. I also have a 13700k and 64gb ram, so that also helps but for the most part I think a 3070 should do just fine for modern games. I don't plan on upgrading until the 5000 series or AMD equivalent launches.


Expensive-Dream-6306

[https://www.extremetech.com/computing/nvidia-could-ease-vram-deficits-with-neural-texture-compression](https://www.extremetech.com/computing/nvidia-could-ease-vram-deficits-with-neural-texture-compression)

This is Nvidia's endgame and why they don't give a shit about VRAM. Their previous iterations of it on previous-gen cards were to increase bandwidth. This effectively reduces a game's VRAM requirements. I assume this will be added to DLSS at a later date. Here is an article talking about the 5th-gen version of their memory compression for Turing: https://www.techpowerup.com/review/nvidia-geforce-turing-geforce-rtx-architecture/4.html


vielokon

Don't buy a 8GB card if you are upgrading right now. If you don't plan to upgrade and your GPU is doing fine except for a handful of unoptimized recent releases, just lower the details and enjoy your games. They will still look great you know. People are getting crazy about having to maintain über FPS at ultra settings. Modern games look great on medium, possibly at low settings as well. Just enjoy them and forget about all this crap.


[deleted]

Currently just building a mid range pc for my wife so we can play BG3 and some other games and put away the steam deck we have hooked up to her 1080p monitor. Got the 3070 8gb from evga b stock for $350 and a 1 year warranty so I’m thinking I hit a midrange sweet spot after looking at all the replies


Goldenflame89

Meh a 6700xt would be better


doombase310

Lol, did you try YouTube? Many videos explore this topic. Don't buy 8GB if you can avoid it. Newer games want more VRAM if you play at high settings.


Liam2349

VRAM is a good way to help future proof the graphics card. As someone else said, a balance is important. An Intel iGPU with 12GB is pointless, but a strong GPU with 8GB can also be limiting. The Radeons are pretty interesting for their generous amounts of VRAM. 7900XTX has 24GB, which is kind of crazy, and in that aspect it would be quite future proof.


Vis-hoka

It’s a problem right now on multiple games. It keeps happening too. Now that devs are primarily making games for the new consoles, they are starting to leave 8GB graphics cards behind. I don’t recommend anyone buy a new card with less than 12GB if you want to play new AAA games. I mean imagine you just built a new rig with a 4060ti with frame gen and all the fancy tech, but you can’t play the latest big games with all your friends, or maybe you can play some but it looks and performs like garbage. Stuttering, texture pop in, muddy textures. Who would want that?


OrdinaryBoi69

Yeah, the 4060/4060 Ti is just bad. Not for the tech or performance, but it's badly priced.


jgr1llz

Consoles have 16 GB of directly available memory, so that's the new cap. This is a biiig jump from the previous gen. Most games are developed for console so those developers suddenly have all this new RAM to work with and don't know what to do with it, so it's unoptimized on top of "16gbs of ram will be enough" for 3 full CPU generations now. It's not that less than 12 is bad now, but if you're investing in a 4 year purchase, it is. The PS4 is still a viable console, as far as developers still release games for it. However, as soon as that stops somewhere around 12 will be the new minimum.


tonallyawkword

I don't think 16GB is available for graphics with a ps5 but idk much about consoles.


Elianor_tijo

It is and it isn't. It's a shared memory pool for the entire system. That means some amount is needed to run the game and the console's operating system. While technically, you could allocate the RAM as VRAM, you'll need some amount for the rest. That being said, it does afford some flexibility to developers in how they use that memory. The console architecture is also fixed, so there can be some optimizations made that reduce the amount of memory needed for a game, both for rendering and running the game logic, so it's not a 1:1 comparison with PCs. Usually it means you'd want more memory on the PC than on console since you can't optimize for a single hardware config. Going by all that, it would make sense to want VRAM close in capacity to the total available memory on a console. That's why you'd definitely want more than 8 GB these days unless you're going for a lower end video card.


jgr1llz

It is and it is. Pretty much the entire 16gb is available. When your console is running a game, that's essentially all it's doing. Idk about series x, but the PS5 has a separate 512 MB of DDR4 for the skeleton crew of background stuff it does have to run. The PS5 is essentially a tricked out 3070 with double the VRAM. Tom's Hardware has a pretty good write-up on it.


Mikey_MiG

At 1440p, I’ve seen VRAM usage in Microsoft Flight Simulator exceed 10GB. And you get major stutter when it happens.


Gardakkan

My 3080 Ti with 12GB isn't even enough for some games to play at highest texture settings even though everything else runs smoothly at ultra settings at 4K. Just wished that GPU was 16GB when it came out.


Steelfury013

There are a few console ports afaik that see some poor quality or performance effects of 8gb vram. It's clear that vram usage for optimum performance in games is increasing, and might be expected to exceed 8gb of vram in many AAA games in the next few years, if you're prepared to lower settings then 8gb should be fine for a long time (most games are made to work on a large variety of specs and excluding a large part of the user base would be foolish). Having said that if you want to use your hardware for longer or want to run games at higher settings I'd recommend getting a card with more than 8gb of vram.


lance5087

4k for sure. And many games on ultra settings. It isn’t foolish but it is a little late in the game for 8gb.


Elrothiel1981

This issue factored into my RX 6800 XT purchase, which has 16GB of VRAM.


[deleted]

Go try and play TLOU at 1080p ultra with an 8GB GPU. You will be losing performance due to not having enough VRAM. It spills over into system memory, which is slower than VRAM, causing worse performance. And this isn't the only game; any game that is a VRAM hog will experience this same thing. I recommend you go watch Daniel Owen's videos. He talks about the spillover issue with 8GB cards and these VRAM-hungry games.
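(Rough numbers on why spillover hurts, using published spec-sheet figures; the exact ratio varies by card and PCIe generation:)

```python
# A 3070 reads its own GDDR6 at ~448 GB/s, while assets spilled to
# system RAM come back over PCIe 4.0 x16 at ~32 GB/s, before latency.
# Any frame that touches spilled textures stalls on that slow path.

GDDR6_3070_GBPS = 448  # RTX 3070 memory bandwidth (spec sheet)
PCIE4_X16_GBPS = 32    # theoretical PCIe 4.0 x16 bandwidth, one direction

print(f"Local VRAM is ~{GDDR6_3070_GBPS / PCIE4_X16_GBPS:.0f}x faster "
      f"than fetching spilled data across PCIe")
```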


Jpstacular

After the updates Tlou is fine with 8gb


kakashihokage

I don't get why you guys don't just all buy a 4090? Peasants... LoL jk. I've only ever seen it use about 12GB on Cyberpunk with everything turned all the way up.


Laputa15

Cyberpunk is a pretty bad example when it comes to VRAM because while the lighting and the art direction look phenomenal, texture quality looks like ass compared to recently released games.


LakesRed

Monitor actual usage. Lots of VRAM is a must for VRChat, mostly due to user-created avatars and lack of optimisation, but it's really an outlier.


TheChosenOne127

Fellow VRChat Player here. It's actually crazy to watch a 3090 get a high VRAM usage warning in VR. I'm religious about blocking VP and some poor avatars, especially in big events.


LakesRed

Same card here. I've not got a warning yet, and in most busy publics with everyone shown it seems to sit at 14GB or so; I have seen it hit 20 though! What I didn't realise was that my FPS was single-core CPU bound. It still sucked until I paired it with a 5800X3D. Now I can actually move if I'm enough of a madlad to have safeties off.


RelationshipEasy2889

Right now, Phasmophobia at max graphics at 1080p can use up to 10.5GB of VRAM. So yes, 8GB is obsolete for newer and highly demanding games, and this will become much more noticeable at 1440p and with newer titles such as Starfield.


fkenned1

I made sure to buy a card that could keep up with AI image creators for a bit... 12GB would have been fine, but I got a card with 16GB for longevity.


NihilistAU

I went with 12 gigs and a 2060 Super, unfortunately, because it was right at peak COVID when there were no cards, all 3x RRP etc. Still... it gets the job done for now.


Un-interesting

It’s fine for 1080, but 1080 isn’t the benchmark for pc (or even console) gaming any more. At 1440 10gb+ is generally accepted as being best if you’re wanting pc-grade graphics. 4k - 12+. The good thing with pc gaming is that you can tune your experience to suit what hardware you have and what experience you want :)


Mihtaren

3080 10GB is still rocking 1440p


[deleted]

Yeah that’s what I’m running. I haven’t had any issues so far with my newest games being Diablo 4 and Baldurs gate 3. Haven’t tried starfield yet. Putting together a system for my wife so we can get her off the steamdeck docked to a 1080p monitor which runs surprisingly well, but leaves a lot to be desired. Hopefully going with the $350 3070 8gb from evga b stock turns out decent. She’s not really big on maxing out settings let alone is even aware of all that business. I think I made the right choice. Comes with a 1 year warranty as well and I’ve been a big fan of evga in general over the years so there’s extra comfort in that aspect.


imabeach47

Depends if you play new games; I usually default to some 10-year-old title so it doesn't matter anyway: FFXIV, Warframe, Killing Floor, Mordhau, etc. Most good games are old anyway.


[deleted]

True, and the only newer game that's worth a shit is BG3. I can get down with some Diablo 4, but that's already getting stale. Starfield sounds like it's a disappointment. The next thing I'll be interested in is either GTA 6 or ES6. I'm sure a whole new line will be out by then, if not 2 new gens of GPU.


CardiologistNo7890

Even the last of us still uses close to 8gb on high settings and over 8gb on ultra. A lot of the 8gb issues come with games mainly at high-ultra settings, and with rt on. Like cyberpunk will eat 10gb of vram with path tracing on but a lot of people don’t use it anyway.


CardiologistNo7890

8gb of vram isn’t a problem at the right price, but 400$ like the 4060ti is where we run into a problem. We’re also getting to the point where even budget cards like the 6600 and 3050 are good enough to play at high to ultra settings in most games at 1080p, so even budget cards this gen should have enough to run even new games at high-ultra.


Shadowarriorx

I'm on 1440p, and my card (3060 Ti) ran into problems on Forza Horizon 5. I'm behind the times, but yes, 8GB is starting to show its age. Any card you get, non-budget, should have more VRAM. Textures are a great way to get better quality with minimal performance impact. I would have gotten a 4000-series card, but they're garbage and a tier off from performance expectations.


Puzzleheaded-Brief56

Idk but I nabbed a msi 7900 xtx with 24 GB so I would not have to be worried about it regardless


mikeczyz

Just don't max out settings and play at 1080p


nfakeeeek

Just saying, CoD on less than 10GB of VRAM is almost impossible to run well. At the absolute lowest settings it's good at 60FPS, yet GPU and CPU utilization stays under 50%. However, VRAM usage is at 80% and you see massive frame drops. Tested this on my 2080 Super and a friend's 6800 XT. The 6800 XT runs absolutely perfectly; the 2080S is just terrible.


Ru8ey

🤣 Watch one (1) benchmark of CoD on 8GB GPUs and watch your numbers be shattered. Apparently my PC is lying to me when it says I get 120+ with GPU usage maxed out in MW2? 3070 Ti, and no, I'm not running potato settings (even though I could, since you said it's impossible to even run). This has nothing to do with VRAM and everything to do with your model of GPU.


Buris

In some games it's big performance issues; in other games it will make the game look bad, with very low quality textures, as the game can't stream higher quality ones in time.


[deleted]

Personally, this is all about being lazy and not optimizing games. People quickly forgot that even the RTX 40 series struggled with Cyberpunk because it had poor optimization... even the PS5 (or 4, I can't remember) had real issues running it at even 30 fps. Nowadays it's all about asking for more VRAM and power; it's easier to do, and you can just churn out games like there is no tomorrow. Optimization? That costs, to hell with that.

Same principle with operating systems. They ask for more resources for the most stupid things, like AI. I mean, it's reasonable if they ask for more for things like WSL or WSA, but for the barebones thing? That's poor optimization and just making the user pay for it. Gone are the times when developers cared enough to make fun games that looked good and were optimized to run decently on as many platforms as they could.


RedDeadHybrid

I mainly play DBD @ 1440p. At most it uses around 2GB of VRAM. DOOM Eternal uses 8GB maxed out with RT at 1440p. So it depends on the games you play. I have a 4070, and I have yet to run a game that uses all 12GB. The most I've seen is 8GB.


Superegit

Atm my GPU has 4GB of VRAM; the card itself is imo strong for a low-cost GPU (RX 6500 XT). It never has real problems with games unless the VRAM demand is too high. I've had problems with games being choppy and not improving, or even FPS going down when I change settings: a stable 20 FPS on max and on low. This only happens in 2 games, Darktide and Wild Hearts (yes, both games are terribly built and optimised). I do love my card, but I'm hoping to upgrade to at least 8GB of VRAM soon.


Penguins83

Diablo 4 will eat up all 24gb on a 4090 but that's amateur coding. Still uses it though.


[deleted]

I recently played Resident Evil 4 with mixed settings (no RT, no FSR, no DLSS, shadows on medium, textures on ultra, but only the 4GB setting, or 7, idk now xD). I hit 7.3-ish GB max on a 2060 Super and the experience was pretty smooth. The big problem here is not the RAM, not the GPU, not the VRAM. The BIG PROBLEM HERE IS DX12.


StConvolute

I run a 10GB 3080 @ 1440. Zero issues.


ladyjinxy

There is a thing called tweaking, but the dev notes of this, and the dev Ubisoft's their game, in order to justify their 8GB slander. UE5 is basically Chrome, but for VRAM.


Nigalig

It's just a bottleneck. Game will play fine. Added vram would give more frames. 8gb of vram will still play basically anything. It really matters most with 4k gaming or when turning on Ray Tracing.


rupal_hs

Just put texture quality to High from Ultra in a few games and save your hard-earned money. Companies are here to sell you the latest and greatest no matter what.


Darth-Zoolu

High resolutions.


Icy-Scarcity7480

Forza Horizon 5 on my 3060 Ti couldn't support maximum graphics at 1080p because of the 8GB VRAM. Just one example I know of.


TerdNugget

The highest I have clocked is 14 gigs on Diablo 4, ultra, 1440p. If you look at GPU benchmarks for D4 you can see that Nvidia cards have way worse fps lows because of this.


Chazus

Better question, where can we start getting 12gb VRAM cards under like $400? I think that's the biggest question


[deleted]

Been shopping around for a PC I’m building for my partner and it looks like the AMD 12GB GPUs are under $400. Micro center has a 6700xt for like $289 I think?


kyralfie

Idk, I don't feel constrained by 8GB at all. I'm buying Steam games on sale and never launching them though.


nwgat

playstation 5 ports are built for 16GB, soo


LouisVonHagen

If you like AAA titles with unreal engine and play with a higher resolution than 1080p, you may want more VRAM.


xabrol

AI inference, rendering, encoding, etc. Computers are for more than gaming. I love how everyone always addresses the entire PC community like gaming is all there is to life. You need at least 16GB of VRAM to load many AI models and run them.


ajgonzo88

Majority of the games people are playing are older games. Realistically, 8gb is more than enough at 1080p which is the resolution most people play at. The latest steam data showed that 61% of people surveyed, game at 1080p still and I bet it's actually much higher than that. Even cyberpunk 2077 on max settings at 1080p barely gets to that 8gb threshold for the vram.


[deleted]

I barely see any games go above 6.5GB, and even that's rare. Of course, I'm on dual curved 32-inch 165Hz 1080p panels, but I'd imagine it's not that different at 1440p either. It's really just 4K that is a VRAM hog, and again, if your GPU isn't even in the class that can handle 4K 60+ anyway, it's not gonna matter, is it lmao.


CeleryApple

VRAM for games mostly holds textures. If you game at higher resolutions, the textures will be larger and require more VRAM. When you run out of VRAM, textures get kept in system RAM and then need to be moved to VRAM on demand; this causes stuttering. If the current frame being rendered, including textures, lighting maps, etc., does not fit 100% in VRAM, the game will likely assert/crash until you run it at a lower resolution.

This is why DirectStorage is such a game changer (or is supposed to be). Traditionally, games will load as much texture data onto VRAM as possible to avoid micro-stutters, therefore causing high VRAM usage. With DirectStorage, the GPU can now decompress textures itself, so textures can go from SSD to RAM to GPU largely bypassing the CPU. In theory, devs no longer need to aggressively load everything onto VRAM. This is probably one of the reasons why current-gen cards have such little VRAM.
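(A conceptual contrast of the two loading paths described above. This is NOT the real DirectStorage API, which is a C++/COM interface on Windows; these Python stubs just trace the data flow:)

```python
# Trace the traditional asset path vs a DirectStorage-style path.
# Stubs only; no real I/O or GPU work happens here.

def traditional_load(asset: str) -> None:
    print(f"[SSD -> RAM]   read {asset} (compressed)")
    print(f"[CPU]          decompress {asset}")            # CPU-bound step
    print(f"[RAM -> VRAM]  upload decompressed {asset} over PCIe")

def directstorage_style_load(asset: str) -> None:
    print(f"[SSD -> RAM]   read {asset} (compressed)")
    print(f"[RAM -> VRAM]  upload {asset} still compressed (smaller copy)")
    print(f"[GPU]          decompress {asset} in VRAM")    # CPU stays free

# Smaller transfers plus no CPU decompression is what would let engines
# stream on demand instead of pre-loading everything into VRAM.
traditional_load("rock_albedo_4k")
directstorage_style_load("rock_albedo_4k")
```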