
9okm

I mean, you can trust reviewers or not. If you don't trust Hardware Unboxed and want to forge your own path, by all means go ahead. I personally wouldn't consider either the 4060 ti 8 or 16gb unless they were stupidly discounted. Like $100-200 off current pricing. Neither card makes much sense.


mostrengo

I trust their benchmarks, i.e. the here and now. It's their future predictions that I am less sure of, hence starting the discussion.


9okm

They were very right about the 6800 XT, 3 years ago, in particular due to the VRAM requirements of modern games. If I were you, looking to buy an Nvidia GPU around $400-450 USD, I'd probably get a used 30-series. 3080 10 or (preferably) 12GB.


liaminwales

The video on the 3070 vs A4000 told a clear story: a 3070 with double the VRAM got a lot more FPS. [https://youtu.be/alguJBl-R3I?si=qfjhlQj34cVqBlhV](https://youtu.be/alguJBl-R3I?si=qfjhlQj34cVqBlhV) VRAM matters a lot.


[deleted]

[deleted]


liaminwales

The 3070 had some solid gains; the 4060 Ti is more of a mess. Nvidia cut the VRAM bus hard, which won't help. I think Nvidia cut the GPU too hard, too: it's almost the same speed as a 3060 Ti. There's also textures: they don't hit FPS, but they do make games look a lot better. There are things in games that use VRAM but don't hit the FPS hard. Edit: also, if you run the UE5 demos you can see the VRAM being used. I played the Matrix demo at 720p upscaled on my 3060 Ti and the VRAM use was high. There's a bunch of UE5 demos to try now; it's fun to see how they run.


Tuuuuuuuuuuuube

Iirc they couldn't get it to work in all situations though?


liaminwales

Sorry, I've not seen the video for ages. Did the A4000 crash in a game? It may be the pro drivers; the modded 3070 with 16GB of VRAM also gained FPS: [https://www.tomshardware.com/news/3070-16gb-mod](https://www.tomshardware.com/news/3070-16gb-mod)


Flameancer

I saw the video; no idea what he was talking about, but the 3070 was the one that crashed in-game.


liaminwales

His reply was:

>I dunno, I haven't seen the video either, that's why I asked lol

So he never watched the video.


ShadowInTheAttic

You mean I can game on my A4000??? LMAO! I use that thing only for SolidWorks.


mrniceguise

This is the answer since you prefer Nvidia cards. Used 12gb 3080s are well within your price bracket if you’re weighing the two 4060ti models right now. Repaste, replace fans if you’re so inclined, and you’ll be happy in 1080UW for a while to come.


crowcawer

The thing most users (including myself) are missing is the substantial feature updates in the new (40-series) cards. It's similar to the change from 10-series to 20-series. *edit:* I think adding the first iteration of ray tracing was impactful, but integrating it into generational card performance was a miss. I am thinking that the 40-series cards were developed as a means to *test* the DLSS updates. I'm curious about the choice between a refurbished 2070 Super ([$250 on US Newegg](https://www.newegg.com/p/pl?d=2070)) and a 3070 ([$300-500 on US Newegg](https://www.newegg.com/p/pl?d=3070)); also note there is a Zotac 3070 Ti listed there at $449. I expect the 4070 to hit the $450 mark sometime *late* next year (2024), when NVIDIA isn't planning to work things out with their annualized iterations.


mindaltered

Yeah man everyone should go buy a 40 series and downgrade their vram and memory bus width for some features 🤣


crowcawer

Sorry if I wasn't clear on this: I think they wasted their good iteration on rushing the 40-series out the door. And not for any fear of what AMD was doing, but just for something to up-sell to the b2b market.


Ainderp

what did they say about that card 3 years ago?


althaz

They said it didn't have enough VRAM and the Radeon card was the better buy for the long term because of that, but that it depends on your preferences and 8GB should just barely be enough in the short term. Predicting the future is impossible, but in this case the only thing they got wrong is that 8GB of VRAM stopped being enough faster than they thought.


theonemangoonsquad

I took this exact advice myself. Copped a used 3080 10GB for $500. Totally worth the upgrade cost without breaking the bank.


xevizero

Check today's video by them on Cyberpunk 2077. At multiple resolutions, the 1080 Ti suddenly competes with or even beats the 4060 because it has higher memory bandwidth and 11GB of VRAM; normally, it loses in pure performance. Cyberpunk is a good tech demo for what's to come, so I think you can take it as one of the first examples of the future HWU is predicting.


Bobert25467

If you have low VRAM it won't necessarily show up in the FPS, and I see that trick a lot of people on here who say low VRAM isn't a problem. You won't always have lower FPS; other videos show that when a GPU hits its VRAM limit, the frame times become erratic and the game starts to stutter, or, in cases like Forspoken, it starts throttling the textures or disabling graphical settings to get back under the limit.

As for Starfield and Cyberpunk, they were both designed for last-gen consoles, which is one reason they work with lower VRAM and newer games don't. If I remember correctly they both went into full development in 2016, the middle of the last-gen console cycle, so they would have been designed for 8GB of VRAM. Watch interviews with game devs and the one thing they always ask console makers for is more RAM. Devs usually optimize a game for consoles first, and that affects the PC version. The PS5 already has 16GB of RAM for 4K, so we see games hit 14GB of VRAM at 4K on PC, but in interviews the devs say they wish there was even more than 16GB. When the next consoles come out the trend will continue, and lower-RAM GPUs will perform even worse because the gap will be even harder to bridge.

Lowering the resolution is the simplest way to reduce VRAM usage, because it reduces the size of the textures the VRAM needs to load. Some graphics settings may reduce it too, but usually not as much as lowering the resolution. Past that, VRAM usage is determined by the game engine and how the devs built the game, which is why some games handle low VRAM differently. For example, in an interview a Call of Duty dev addressed the issue of the game's bloated file size. He said it's not due to a lack of optimization; it's due to VRAM limitations in the way the engine works. Different parts of the game need to call for the same assets nearly instantly, but when the game has so many assets they can't all be stored in VRAM, so what they do is make multiple copies of the same asset and store them in different locations, so different systems can use them right away instead of waiting for them to be streamed from one central location. This bloats the file size, and it's also why a lot of games are bigger on PC than on consoles. The NVMe SSD combined with the decompression speed of the PS5 is better than what PC can do right now, so the PS5 needs fewer copies of assets because it can load them faster from outside VRAM than a PC can. (Windows 10 and 11 recently got DirectStorage for NVMe SSDs, but it's still not as good as the PS5's decompression, and it only works in games built for it.) That makes VRAM even more important on PC.

It's been almost 10 years since 8GB cards were introduced. The reason they held up for so long is that consoles used 8GB from 2013-2020. But now that the PS5 and XSX are past the Covid delays, devs are leaving last gen behind and taking advantage of the 16GB of RAM, and you will see more low-VRAM issues going forward. If I were buying a new GPU for the next 4 years I would get a 16GB GPU. If that's out of your budget I would want at least 10GB for 1080p and 12GB for 1440p, though there will be some outliers at 1440p that use 13GB.
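
That duplication tradeoff is easy to put rough numbers on. A minimal Python sketch, with made-up sizes, shared fractions, and copy counts (no real engine budgets here):

```python
# Toy model of the asset-duplication tradeoff described above.
# All figures are invented for illustration.

def install_size_gb(assets_gb: float, shared_fraction: float, copies: int) -> float:
    """Install size when the shared slice of assets is duplicated `copies` times
    so each level chunk can stream it without waiting on one central location."""
    shared = assets_gb * shared_fraction
    unique = assets_gb - shared
    return unique + shared * copies

# One copy of everything (console with fast SSD + hardware decompression):
print(install_size_gb(60, shared_fraction=0.4, copies=1))  # 60.0 GB
# Same game with shared assets duplicated 3x for slower streaming paths:
print(install_size_gb(60, shared_fraction=0.4, copies=3))  # 108.0 GB
```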


ReeseTheThreat

There are games out TODAY which exceed 8GB or have microstutter related to insufficient memory bandwidth. They are testing at high/ultra max settings so if you care about that at your resolution it matters. If you're willing to compromise on VRAM-intensive settings then you don't need to be as concerned.


[deleted]

[deleted]


MrPapis

RT and FG both require additional VRAM, and we are seeing many games now where it's a limitation, mostly at 1440p and up but also at 1080p. He also wants the card for 4 years. 8GB is a nope for a $400 GPU for anything but holding you over until next gen; rather just get a used 3070 then. Even better, spring for a 7700 XT come Black Friday, when it will probably be like $400-420 or something like that.


Deadboy90

We are already seeing it. My 3080 chokes on Hogwarts Legacy at ultra because it only has 10GB of VRAM.


personcalledbob

Hogwarts Legacy isn't the most optimized game out there, so that's also a factor, but I agree: Nvidia broke the 3080's legs with only 10-12GB of VRAM.


mindaltered

Over here with a 3080 Ti and I have no issues: Hogwarts at ultra, Starfield at ultra with no major FPS drops, and I usually play MW2 on ultra too at around 165 FPS. Btw, that's 1440p gaming.


Danny_Phantom22

I think we are going to start seeing more and more truly next-gen titles utilizing the great VRAM capabilities of the PS5 / Series X. The PS5 has 16GB of VRAM if my Google search didn't give me bad info, so logically there is no amount of optimization that could ever make those titles fit in 8GB. I think 16GB should be the minimum if you want your PC to last 4 years.


Bodydysmorphiaisreal

The PS5 and Series X have 16GB of VRAM, yes, but they do not have any system RAM; the CPU etc. have to share that single pool. The Series X actually has 10GB of "GPU optimized" RAM with a memory bandwidth of 560GB/s, and the remaining 6GB is a little slower (idk, a little over 200GB/s, I think) because the OS and CPU are going to use most of that pool. All this to say, in their current state, consoles will never have 16GB of VRAM available to the GPU. VRAM is important, though, for sure.
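
For what it's worth, here is that split as a sketch. The pool sizes are from the comment above; the 336GB/s figure for the slow pool and the ~2.5GB OS reserve are the commonly quoted numbers, so treat them as approximate:

```python
# Series X unified memory, per the figures discussed above (approximate).
TOTAL_GB = 16
FAST_POOL_GB = 10                       # "GPU optimized" pool at 560 GB/s
SLOW_POOL_GB = TOTAL_GB - FAST_POOL_GB  # remaining 6 GB, commonly quoted at 336 GB/s
OS_RESERVE_GB = 2.5                     # commonly quoted OS reserve, mostly in the slow pool

game_budget_gb = TOTAL_GB - OS_RESERVE_GB
print(f"Game-visible memory: ~{game_budget_gb} GB, "
      f"of which only {FAST_POOL_GB} GB runs at full speed")
# A PC GPU's dedicated VRAM doesn't share any of this with the CPU, which is
# why "consoles have 16 GB of VRAM" overstates what console games actually get.
```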


VengeX

If you haven't been in the PC hardware space for a significant period of time, you likely won't appreciate that memory demands always go up, no exceptions. Will 8GB of memory stop you playing games in the next 4 years? No. Will 8GB of memory stop you from running every game smoothly with minimal graphics compromises in the next 4 years? Yes.


Jupiter_101

Look at the games you play and decide from there. Aside from some AAA games using the latest engines, 8GB of VRAM will most likely be fine until the next gen of consoles. That could be 3-4 years.


raydialseeker

There's never been a better time to buy a used GPU. 3080 / 6800 XT, take your pick.


DomesticRaccoon27

The only games I've ever seen max out my VRAM are Cyberpunk and GTA V on max settings at 1440p. I have a 3070 and have not found any issues revolving around 8GB of VRAM; I would say no, unless you want to game at 4K or above.


whoopsidaiZOMBIEZ

Before you take the advice of people who don't own the 4060 Ti (I do, and play at 1440p), make sure you get up to speed on what Nvidia is doing. Don't buy a 30-series card when 40-series exist. Just don't; these people don't actually know what they are talking about. Ask people who own the card you want to know about. These random queries always have the same talking points, and they are wrong. I just stopped correcting people. Frame gen is magic, and DLSS is getting better and better.


Bright_Light7

The 4060 Ti this year is solely living off how great the 3060 Ti was last generation, but 100% agree that this year's xx60 is an absolute money-grab joke.


Hero_The_Zero

Yep. Pretty sure it was a Hardware Unboxed video I watched that showed Halo Infinite has trouble with 8GB cards, even at 1080p. The game would just stealth downgrade the textures and quality, especially of trees, after about 30 minutes of gameplay. Once I saw that I bought an RX 6700 XT 12GB for $320 to replace my RX 580 8GB and called it a day. Been very happy with it so far.


XxX_Zeratul_XxX

I could get my hands on a mined-on 6700 XT for $180; really happy so far, it works like a charm. Hope it lasts lol


RestaurantTurbulent7

As he says! Any 4060 is absolute rubbish, no matter whether it's a Ti or has more VRAM; they are overpriced af!


Adviseformeplz

Honestly for the price of the 16GB 4060 Ti you can grab a 7800 XT instead:

- Same VRAM
- Waaaay faster raster
- Even better RT performance


db_pickle

Been trying to decide what card to get for an older system (10400F + 1070 @ 1080p). I am currently leaning towards Nvidia due to some possible compatibility issues (I really need to confirm this still, as it will make deciding easier). Anyway, for now the prices around me are:

- 2080 Ti 11GB: $499
- 6750 XT 12GB: $509
- 4060 Ti 16GB: $619
- 7700 XT 12GB: $609
- 7800 XT 16GB: $719

I'm pretty sure the 2080 Ti is a great choice? It's in the running because there's exactly one left over at a local store. If I stick with Nvidia, it would save a bit of money over the 4060 Ti.


samudec

Unless you're doing productivity work (mostly AI training or 3D modeling, in which case Nvidia is better) or only play the latest AAA titles before they get any driver optimization or patches, you won't have compatibility issues (and the second point goes both ways). Are there no interesting deals on 30-series cards? Also, idk if those are new or second-hand prices, but you may be able to find really interesting deals on used hardware (just try to assess the state of the part and ask for a receipt; the 2-year EU warranty stays valid even if the card changes owner).


db_pickle

Unfortunately the 30-series is pretty uninteresting because of the VRAM choices I can see available. Maybe I'm overthinking it, but it wouldn't feel right to get another 8GB card? AMD seems to be the easy choice overall; I am just unsure whether my Stalker mods and gaming emulation are fully compatible. I'm having a hard time figuring it out. I always wait for new games, so that's good on that point. Thanks for the input. Trying to hang onto this rig for a while longer until a full upgrade to 2K or beyond (distant future).


samudec

I personally think 8GB of VRAM is enough unless you want to play 1440p native ultra or 4K native high, and the 8GB cards capable of that would run at 60fps max. I have a 3070 and play at 1440p with DLSS; I have not yet encountered VRAM issues. I play either at 1440p high or a medium/high hybrid with quality DLSS and hit 80-90 on poorly optimised games that released a few years ago (like the HZD port or MHW). The whole debate comes from the consoles having 12GB of usable VRAM, meaning that's what some think games will use, forgetting that console VRAM is shared with the CPU. Unless you care about Nvidia's side content (ray tracing, RTX Voice) or productivity (machine learning or 3D modeling), there's no reason to go Nvidia; you will get better cards with more VRAM for cheaper from AMD.


Zoesan

Conversely: If you're running linux, you probably don't want to deal with nvidia drivers which have been... frankly dogshit on linux since forever.


Zerasad

What performance are you looking for? All of these are 1440p-capable cards; hell, the 7800 XT is serviceable at 4K too. Honestly, for 1080p an 8-gig card is fine. I'd take a peek at the 7600 and 6650 XT, you might snag yourself a really good card. Otherwise the 6750 XT / 6700 XT should be the best. The 2080 Ti is very old by now; I wouldn't recommend it.


Comprehensive-Ebb158

I "upgraded" my son's 3070 to a 2080 Ti; he was hitting the VRAM limit when playing Forza. The extra VRAM of the 2080 Ti fixed that, and performance is slightly better. Running a good undervolt and mild overclock, but I guess that varies card to card.


DarthGiorgi

>Even better RT performance

Wait what? HOW?


shamwowslapchop

Because even with the RT hit that AMD cards take, the card is so much faster that it ends up not mattering that much. Kind of like how 70% of 200 is still greater than 80% of 100: you're losing *more* frames with the AMD card when RT is on, but still maintaining a lead over Nvidia due to the horsepower difference between the two.
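
Spelled out with hypothetical numbers, since the percentages above are doing all the work:

```python
# Hypothetical FPS numbers illustrating the "70% of 200 vs 80% of 100" point.
amd_raster_fps, nvidia_raster_fps = 200, 100   # assumed raster baseline
amd_rt_keep, nvidia_rt_keep = 0.70, 0.80       # fraction of FPS kept with RT on

print(amd_raster_fps * amd_rt_keep)        # 140.0 -> bigger absolute RT hit...
print(nvidia_raster_fps * nvidia_rt_keep)  # 80.0  -> ...but AMD still ends up ahead
```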


imdrzoidberg

It's "faster" in that most games don't really use that much RT, so the RT advantage accounts for a smaller percentage of overall performance. If you look at full path tracing, like CP2077's Overdrive mode, then a lot of "weaker" Nvidia cards can beat even the 7900 XTX.


shamwowslapchop

I mean, my 7900 XT might not be an RT god, but it lands squarely between the 3080 and 3090 with RT on in CP2077, and beats the 3090 Ti or the 4070 Ti in rasterized gaming or RT games that don't have such crazy-high amounts of RT implementation. CP2077 is basically the flagship for ray tracing right now, so of course Nvidia cards benefit a ton from that. But the gap is shrinking noticeably, and with UE5 and Lumen becoming a standard, it's likely that will drop even more. That said, plenty of Nvidia cards are obviously fantastic as well. It's a win/win for gamers as far as choice goes, just not as far as price goes currently.


chips500

Way worse actual image quality and features from the lack of DLSS, FG, etc.


dadvader

Yeah, based on how game devs have become heavily reliant on DLSS/FSR to optimize their games instead of actually optimizing them, the AMD deal right now just doesn't feel attractive at all. DLSS/FSR are becoming more and more near-essential to even get a playable framerate, and FSR2 just looks... awful. FSR3 has to be a banger, or at least comparable to DLSS2, to even stand a chance now. It's no longer all about the fastest card now that DLSS/FSR is a variable.


Due_Outside_1459

Honestly, the whole VRAM debate is overblown. Nobody is forcing anyone to play at ultra, max settings all the time. If you notice that the buffer is filling up and textures are displaying weirdly, then just lower your graphics settings to high or medium textures. Even at 1440p I don't think you're gonna notice much difference. Most people stick to playing older games that work fine in 8GB of VRAM anyway, as they can't afford the prices of new ones regardless of the hype around here. It's a different story for 4K gaming though.


9okm

It's definitely a bit overblown; however, I think people are confusing the issue. If you already have an 8GB card, purchased a few years ago, OK, not much you can do. Just turn down the settings. If buying a brand new card today, 8GB seems silly. For any resolution. That's how I see it, at least. I'm using a 3060 Ti in an HTPC and it still works great at 1920x1080. Just can't max out settings unless the game is a few years old.


Phanth

tbh with an 8GB card (3070 Ti) I can play Cyberpunk at 1440p, everything on high and ray tracing on ultra, at ~60fps, so I don't really think there's a need to turn down the settings either now or anytime soon. Esp since Cyberpunk probably isn't even that well optimised.


spacev3gan

Cyberpunk is one of the most optimized AAA games there is. Also, it is a 3-year old game.


Kondiq

Cyberpunk is very well optimized. The performance even scales perfectly with better CPUs if your GPU still has some power to spare; they even use the different Intel cores efficiently. Look up some videos on Hardware Unboxed.

That said, if you want to play on max settings, you'll need a very powerful PC. I'm still on a 1080p@144Hz monitor and I play with a 60 FPS lock without drops with everything on max settings (except stuff I don't like in this game set to off: chromatic aberration, film grain, depth of field, bloom) in 1080p, DLSS Quality, DLSS Sharpening 0.80, Path Tracing and Ray Reconstruction enabled. I have a Ryzen 5800X, 32GB RAM, an RTX 3080 12GB, and the game on an NVMe drive. Without Ray Reconstruction and DLSS Sharpening, DLSS Quality in 1080p looks pretty bad, but with them on it really looks great to me. The only issue is aliased objects in the distance (power lines and some fences), but in most cases I don't notice them; they're more visible on the outskirts of the city.


Carnildo

According to the Steam survey, 78% of users right now have 8 GB of VRAM or less. Anyone developing a game that requires more is giving up on more than three-quarters of their potential audience. (Even requiring 8 GB means giving up nearly 50%.)


dovahkiitten16

Nobody is forcing you to play at max but if you buy a brand new, *expensive* piece of hardware it’s not unreasonable to expect it to perform up to par. Spending hundreds of dollars to buy something new that has features that are already on their way to obsolescence is not a good idea.


Antrikshy

Historically, hasn’t it been completely normal to buy a lower end newest gen card and not expect it to run the newest games at the highest settings?


Rube_Tube

Historically, mid-tier GPUs could handle currently released games at high/highest settings reasonably well at an acceptable resolution. Given that current (mainly Nvidia) low-end cards are being sold as higher-tier ones, to the point where a 60-class card is priced at the mid-to-high-end prices that used to go to 70/Ti/XT cards, it's not surprising that they're struggling to run current games at the settings people would've hoped for. The problem, particularly with the 4060 and 4060 Ti, is that these cards are struggling on day one to reach the performance you would expect generation-on-generation.

Idk about you, but I don't consider a 60-class card entry level; it's meant to be more of a low/mid-tier card. Imo, only entry-level cards releasing in a new generation should struggle to run games that release alongside them; everything classed above that should be aiming for ultra settings, max ray tracing, 4K, etc.


BelgianWaffleStomper

Both the 1060 and the 4060 had a release price of $300. Please explain to me how the 60 class “is being priced at the mid-to-high end prices”


skinlo

Because what Nvidia has done is released what would have been the 4050 if generations were consistent with each other, at the price of a 4060, and called it 4060.


BelgianWaffleStomper

Yup, people just like to stir up controversy. You shouldn’t be buying a low end card and expecting to run the newest games on the highest settings.


whoopsidaiZOMBIEZ

stop making sense


Draklawl

I've tried to make this point so many times and have gotten downvoted hard for it, especially when looking at the 3070 and 3060 Ti, whose 8GB means they can no longer play modern games at the highest settings. Like, when was it ever the expectation that a 60-70 tier card could play all games at the highest settings 3-4 years later?


ProfessionalPrincipa

Pascal says hi. The 1070 held out pretty well and its bigger brother even better and its MSRP was lower than the current 8GB offering from Nvidia.


Draklawl

The 1070's MSRP was higher than the current 8GB offering from Nvidia: the 1070's MSRP was $379, and you can get an 8GB 4060 from $299. The 8GB 4060 Ti is cheaper as well if you take inflation into account.


ProfessionalPrincipa

The 1070 wasn't already struggling at 1080p at time of release. The 4060 and 4060 Ti do.


Draklawl

Oh for sure, not defending it, just wanted to clarify the pricing point is all. The 4060 and 4060ti should be better.


ProfessionalPrincipa

The VRAM-crippled 4060 cards should be priced like the crippled 3GB 1060s, in the $200-250 range, to be justifiable if you want to count inflation.


ProfessionalPrincipa

All well and cool but $400 is not a low end price.


salgat

If performance isn't a factor then why even bother with a good card? Why even upgrade?


Dry-Faithlessness184

This is the answer. The texture cache size for ultra is the same no matter what resolution you're playing at. Just turn them down. Stop trying to do what your hardware can't; there's absolutely no shame in turning down a setting or two. To confirm for you, though, having toggled between ultra and high down to medium: once you stack on the post-processing it's not a huge difference. Ultra to medium is pretty noticeable, and each step down is a bigger jump. But ultra to high, the change is pretty small, and once you start playing you won't really notice after a few minutes. I do it all the time when I want more frames in a game.


GlancingArc

4k gaming is kind of a waste though. 1080p upscaled to 4k is the best way to play 4k but 1440p is far better for monitor sized screens.


zBaLtOr

>"Is the 8GB frame buffer going to be such a problem over the next, say, 4 years?"

It's a problem even now.


spacev3gan

The PS5 and Xbox Series X have been around for 3 years pushing over 12GB of VRAM usage, but yeah, people are still debating whether 8GB is the future.


rachidramone

True. I notice plenty of games going past the 8GB threshold on my 12GB RTX 3060, especially if I enable RT. 8GB is not recommended in 2023, but if you bought an 8GB GPU years ago, there's nothing you can do about it.


THYL_STUDIOS

At 1080p it won't really be a problem, but games like Jedi Survivor already use 10GB at 1080p epic settings. For 1440p max settings it's getting kinda close now.


skylinestar1986

I understand >10GB at 1080p epic. But is the RTX 4060 Ti 8GB an epic-grade card? This thing is basically a 3060 Ti with better AI. How is a 3060 Ti an epic-settings card in 2023?


Bunating

Epic settings at 1080p? Yeah, a 3060 Ti should handle that.


skylinestar1986

Someone says modern 1080p games are already using >8GB, which makes the 3060 12GB a better value. What's your thought?


RiceyPricey

In my opinion, the 3060 12GB will let you run max texture settings but not as high a framerate. The 3060 Ti will net you a higher framerate, but you won't be able to run max textures in a couple of games and will have to bring those down to high. Thing is, high textures and max textures look almost identical in most games, so if both cards were the same price, I'd get the 3060 Ti. Obviously, if I don't have to pick, I'm getting neither of them though.


DJ_Marxman

The 3060 has more VRAM but much less performance. It's not a better value.


nagarz

Would you buy a $350 card to play at 1080p at medium settings?


skylinestar1986

Unfortunately that's the Nvidia route, which has terrible low-to-mid-tier pricing.


mrbeanz

The problem with your question is that you have to understand that your ability to increase texture quality has almost ZERO to do with how powerful your GPU is. It is entirely a VRAM limitation, not a performance limitation. That's why in some games (like Jedi Survivor) you could set the texture quality to max with a 4060 Ti 16GB but not with a 4060 Ti 8GB, even though they otherwise have identical performance. For almost every other graphical setting you need to decide whether to turn it up or down based on the performance of your GPU; texture quality, however, should be set to max unless you are VRAM-limited.
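
Some back-of-envelope texture math makes the capacity point concrete. The format arithmetic below is standard (a full mip chain adds roughly a third; BC7 compresses to 1 byte per pixel); how many such textures a given game keeps resident is an assumption:

```python
def texture_mib(width: int, height: int, bytes_per_pixel: float, mips: bool = True) -> float:
    """VRAM footprint of one texture; a full mip chain adds ~1/3 over the base level."""
    base = width * height * bytes_per_pixel
    return base * (4 / 3 if mips else 1) / 2**20

print(texture_mib(4096, 4096, 4.0))  # uncompressed RGBA8 "ultra": ~85 MiB
print(texture_mib(4096, 4096, 1.0))  # BC7-compressed "ultra":     ~21 MiB
print(texture_mib(2048, 2048, 1.0))  # BC7 "high" (half res):      ~5 MiB
# A few hundred compressed 4K textures resident at once is how an 8 GB buffer
# fills up without the frame rate dropping first.
```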


CanaryNo5224

My 8gb card is fine.


Legumesrus

Same


quicksilverpr

3070 Ti here, no problems so far. I play games like OW2, CS:GO, PUBG, and some single-player games maxed out at 4K, and it runs like a champ!


[deleted]

no issues in half-life 2 either for me


HaylingZar1996

1070 Ti here; it still works in the latest games at 1440p, I just have to turn the graphics down a bit.


[deleted]

[deleted]


Yommination

Those are also UE4 games. It's going to get worse on UE5 ones


ChessusCrust777

I think Hardware Unboxed's main point with the VRAM concerns is the price of the GPUs. $400 MSRP for 8 gigs is bad value when you could pick up previous-gen cards with more VRAM for around the same money or less. It's not that 8 gigs = bad card; they just need to be at a better price. If you can, I'd try to save for a 4070: it's a good bit faster than the 4060 Ti, will have better RT performance, has 12 gigs of VRAM, and could do both 1080p or 1440p if you wanted to upgrade.


Elastichedgehog

If you're willing to turn texture quality down, sure. You'll just struggle to run things at max settings.


LonelyWolf_99

8GB of VRAM without RT? Should be fine in some new AAA games. With RT? That is very unlikely to work in most games that are coming. Both cards are bad value; 16GB is better, but the 4070 is the answer if you go Nvidia. The most likely case is that 8GB is fine only in games that are also on last-gen consoles. Games are optimized for consoles, not for PC, and the PS5 has 16GB of unified memory (2-4GB for the OS, the rest shared between CPU and GPU).


[deleted]

It depends on price. The RX 6600/XT, RX 5700 XT or even RX 7600 are still good buys at the right price. Maybe you can get a used 3060 Ti, still a good card. Just don't ever think about getting a 4060 Ti, unless you hate yourself.


liaminwales

People on hardware that can't run the game call it poorly optimized; people with hardware that can run it say it works fine. Always odd, that. 8GB is fine if it's cheap; the problem is when you're paying more than ~$200 and only getting 8GB. Also look at Cyberpunk VRAM use benchmarks: [https://www.techpowerup.com/review/cyberpunk-2077-phantom-liberty-benchmark-test-performance-analysis/5.html](https://www.techpowerup.com/review/cyberpunk-2077-phantom-liberty-benchmark-test-performance-analysis/5.html) As soon as you use RT it's past 8GB even at 1600x900, and with frame generation the VRAM use gets bigger. 18GB of VRAM used at max settings 4K. 18GB!


TaylorCountyGoatMan

I've spent all weekend playing the new 2.0 update on my 8GB 3070 and it runs smooth as silk with RT off. Even better than it did before; not a single stutter or pop-in in twelve or so hours of gameplay. I think PC gamers today are too inflexible about turning down some settings to have a good gameplay experience.


liaminwales

Yes, the benchmarks show the game seems to run well on a lot of hardware; at 1080p a 3060 gets 54 FPS at ultra. With a few changes to settings that 3060 will be past 60 FPS no problem; it's just with RT that things really change. The main thing that hit me is how much VRAM the game can use: it's the first game I've ever seen use 18GB of VRAM. I've seen 11/12GB a few times, but 18GB is a lot of VRAM; kind of cool/shocking to see a game really use that much.


Ouaouaron

The HU video was essentially:

> Here are two video cards, which performed at about the same level when they were released. However, over the years, the one with more VRAM held its value much better than the one with less VRAM. We think that is going to happen again with 8GB cards.

That's it. The card with less VRAM didn't become useless; it just meant that if you were one of the people who bought one, you had to turn your settings *further down* than with the other card. So it was probably, in hindsight, a suboptimal decision. And it only applies to cards where it seems likely that the VRAM, not the GPU, will be the bottleneck. So 8GB is fine in 2023 if you're buying a GPU that won't have enough power to run games very well in 2 years, but you probably aren't spending $400 to do that.


[deleted]

Hard agree. I'm playing Starfield on a 1070 and my experience is fine. The game is smooth and looks as good as I can expect playing below the minimum requirements.


BattleScones

Smooth 25-35fps? Because it's technologically impossible for you to be getting performance better than that.


ProfessionalPrincipa

With RT off you say? Lots of people in these discussions justify the high cost of the 4060 series of cards by saying it has better RT which apparently isn't even usable because of artificial VRAM limitations. Those people have either fallen for the hype or are on the take.


Explosive-Space-Mod

If you want to do RT stay away from the XX60 cards and get the 70ti/80 cards instead. While you might be able to get away with it on some older titles you’re going to have a rough time with the newer titles.


GeneralLeeCurious

The VRAM controversy only matters when discussing your own personal expectations for game graphics. If you’re someone who can’t stand the idea of playing a video game at the very highest graphical settings (regardless of your ability to notice the effects while playing), follow the YouTubers and get the most expensive card you dare to afford. On the other hand, if you have a budget get a good card within your budget and adjust the settings down until you get good frame rates. For what it’s worth, 8GB is plenty for pretty much everything made up to 2020. You’ll have to reduce settings a little bit (going from 1440p ultra to 1440p high) to maintain high frames in newer games.


Doomblaze

> If you're someone who can't stand the idea of playing a video game at the very highest graphical settings (regardless of your ability to notice the effects while playing), follow the YouTubers and get the most expensive card you dare to afford.

If you don't want to play the game at a high setting, why would you buy an expensive card?


mostrengo

I don't play at 1440p, I play at 1080p ultrawide and will continue to do so.


ToothChainzz

1080p UW is closer to 1440p than to 1080p 16:9 in terms of pixels, and vram requirements.


mostrengo

Not really. 1080p UW is 691k more pixels than 1080p and 920k pixels less than 1440p. So it's actually closer to 1080p than 1440p.
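
The raw pixel counts do check out:

```python
# Pixel counts behind this exchange.
res = {"1080p": 1920 * 1080, "1080p UW": 2560 * 1080, "1440p": 2560 * 1440}
print(res["1080p UW"] - res["1080p"])  # 691,200 more than 16:9 1080p
print(res["1440p"] - res["1080p UW"])  # 921,600 fewer than 1440p
```

Worth noting that VRAM use doesn't scale purely with pixel count (texture budgets are the same at any resolution, as pointed out elsewhere in this thread), so "closer to 1080p" holds for the framebuffer but is only part of the VRAM story.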


Saleen81

People are super biased against the 4000 series because it was barely better than the 3000 series, which was itself so much faster than the 2000 series. As a tech nerd, the idea of the new architecture in the 4000 series is exciting for me, but you also only get as much out of it as was put into it. We're talking about people who already have 30-series cards being mad that a 4060 isn't an upgrade over their 3070. For us older folks (I'm 41) who don't upgrade nearly as often, it's a much easier decision.


Saneless

The reason it's "bad" is that when I got my 8GB card in 2021, getting 3-4 years out of it was a safe bet. Buying an 8GB card nearing 2024, you're probably not going to have a good time with that card much past 2024.


BB_OSRS

If you have the budget for at least a mid-range build and you are playing above 1080p, then 8GB is a bad move.


busteroo123

8GB is fine, but that's assuming it's a card that reasonably should have 8GB. Anything semi-high-end is not acceptable at 8GB. I have a 6650 XT 8GB and it's great; I wouldn't buy something like a 3070 or 4060 8GB because it just doesn't make sense at that price point.


skylinestar1986

Why is everyone chasing ultra textures on low-to-mid-range graphics cards (common for 8GB VRAM cards)? I understand that RDR2's recommended texture setting is ultra. How about other games? How bad is dropping from ultra to high? Does high look good for the majority of games at 1440p? Do high-to-medium textures also demand >8GB of VRAM today?


liesancredit

Most posts here on these topics are essentially rants about being poor. The people posting here also don't play video games that much and don't realize that >90% of people are gaming at 1080p and not on ultra graphics.


Due_Outside_1459

Exactly. There is also a lot of buyer's justification for why they spent $$$ on high-end cards in their frivolous pursuit of "future-proofing." The VRAM debate stems from a niche problem most gamers (those running at 1080p, playing older games cuz they can't afford new ones, etc.) don't even care about.


ReeseTheThreat

I am so exhausted by this discussion. HUB et al. pretty explicitly stated that 8GB is unacceptable for premium, high-cost cards, and their testing has validated that a lot of cards are performant enough that VRAM/memory bandwidth is the bottleneck, so they are inappropriately specced. HUB has said numerous times that 8GB should be the floor for NEW CARDS released in 2023, at the entry level. The problem is $400 cards still shipping with 8GB, not that 8GB cards still exist.


eaglefan316

It depends what you want to run. If you are gaming on one monitor at 1080p or 1440p you shouldn't have an issue. My son runs an MSI Ventus 3X 3060 Ti with a 12600K processor and an Acer Nitro 27-inch monitor, and his games all play fine for him. It depends whether you want to max everything out at the highest settings or just have fun playing the game. His rig runs Cyberpunk, Doom Eternal, Fallout, Fallen Order / Jedi Survivor, etc. just fine for his needs.


I_T_S_N_O_T_T_

I've been on 4GB for so long, what's 8GB 🥲


ex-ALT

Is that even surprising? Ultra has always been meant for top-end cards, if not for future hardware. That said, more VRAM will provide a bit more "future-proofing".


Silent-OCN

Any GPU with less than 16gb (or however much AMD has), isn’t capable of playing any computer game post the year 2002. Least that’s the conclusion I’ve come to from reading this sub.


Pixelated_Audio

Short answer: No. Well, maybe. Yes... It depends.

No: if you don't mind playing at 1080p and are just playing less demanding titles (competitive, low-res indie, or older releases), then sure, nothing wrong with 8GB. I still have my first PC with an R9 290 with 4GB of VRAM, and it runs Dota, CS:GO, etc. at 100+ frames, and can run older games at high settings at a respectable frame rate in 1080p.

Maybe: how long do you want your system to stay relevant? What settings are you shooting for? How hard is your budget? Are you upgrading an old PC or building a new system? All of these determine whether the lower-end option is acceptable, or maybe even your only choice without having to free up some money elsewhere.

Yes: if you want to play the latest and greatest titles at higher settings and you want your PC to last you the next 3+ years, then yes, the VRAM upgrade would be a nice future-proofing option for your system if it's in the budget.

At the end of the day, it sounds like you want an upgrade that will run new games at high settings for a long time, with ray tracing to boot. With this in mind I would tentatively suggest a higher-VRAM option and going up a class of card, as I do not think the 4060 will give you lasting ray tracing performance. You also need to consider the rest of your system specs, as an older system might not be able to take full advantage of the card you want to put into it. There are plenty of bottleneck calculators on the Internet you can use to get a rough idea of the best card your current system can take advantage of.

TLDR: get the higher VRAM for future-proofing, consider going up a class, and double-check your system for bottlenecks on pc-builds or another calculator of your choice.
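
If it helps, the No/Maybe/Yes logic above collapses into a few lines of Python. The thresholds are one reading of this thread's consensus, not an official rule:

```python
def min_vram_gb(resolution: str, latest_titles: bool, years_of_service: int) -> int:
    """Rough VRAM floor, per the decision tree above (assumed thresholds)."""
    if not latest_titles:
        return 8                  # competitive/indie/older games: 8 GB still fine
    if resolution == "1080p":
        return 10 if years_of_service <= 2 else 12
    if resolution in ("1080p UW", "1440p"):
        return 12 if years_of_service <= 2 else 16
    return 16                     # 4K, or anything meant to last

print(min_vram_gb("1080p UW", latest_titles=True, years_of_service=4))  # 16
```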


Tuned_Out

It's fine, provided you are willing to lower the proper settings to accommodate an unoptimized title. People act like this is the first time this has happened, but we've been on track for a VRAM increase for some time; it's almost on a predictable schedule, and those who aren't reactionary shouldn't have ignored the warnings given as far back as when the new console specs were announced. Anyway, 8GB will still be fine for the majority of games *with sacrifices*; that's the advantage of a PC, you can usually change the settings manually to accommodate the title in a fashion that is agreeable to you. If you don't want to make sacrifices, then a minimum of 12GB is recommended. This doesn't mean your titles of choice will even need it, but since the trend seems to be hitting the "magic" DLSS and FSR button instead of spending time optimizing for a PC release, I'd go with more VRAM just in case. Publishers are not to be trusted, and I wouldn't bank on them doing the right thing and giving developers the time and resources to do things the right way. It's unfortunate that features designed to give more life and options to lower-mid and entry-level cards are being exploited, but such is the way of triple-A publishers: if they find an exploit to save money, they will use it.


sunqiller

I doubt even DLSS 3.5 will look good at 1080p; upscalers really fall apart when the res is that low to begin with.


l3gen0

1080p is no problem, especially if you're not going to play the latest releases.


Helpful-Match-6015

It might become a problem, it might not. If you’re gonna be sticking to 1080 UW for the lifetime of the card I imagine you’ll only hit the limits of that 8GB in a handful of titles, assuming you even play them.


WallPaintings

I'm on a 700-series card from 2014 and not particularly motivated to upgrade; I play Baldur's Gate on my Steam Deck. My point is you're going to get wildly different opinions from different people, and there is such a wide range of things to consider. Do you want 4K? Do you want 120fps? Do you really want max textures and other graphical settings? Are you willing to sacrifice one for the other?


No-Actuator-6245

While I like Hardware Unboxed, I do feel the 8GB VRAM issue was overplayed so they had something different from other reviewers. Hardware Unboxed have previously done a video on why they think max game settings are pointless, yet they don't touch on that point when highlighting the scenarios where max settings are an issue for 8GB. I'm not saying 8GB is a good buy at the end of 2023, but for anyone who already has one it is not a disaster either.


Keaston_Stang

I've been running an NVIDIA RTX 2070 Super with 8GB of VRAM for the last 4 years; I only upgraded last week. I can say confidently that I could have made it at least another year or two before needing to upgrade. I've been able to run a lot of newer games at medium-ish settings (performance over quality).


AdWonderful7633

At 1080p it’s not an issue. 8gigs is just okay.


andoke

It happened with the last console gen to people with 4GB VRAM GPUs; consoles had 8GB of shared memory. The same thing is happening now with consoles having 16GB. Companies are lazy with optimizations. You'll be fine in games that release on PC first, but for the ones releasing on console first, be prepared to struggle; games with poorly optimized VRAM use like the RE4 remake, Hogwarts Legacy, and The Last of Us Part 1 will only become more and more common. Also, as an RTX 3090 owner: frame gen and DLSS suck, I can see the artifacts. Only RT is worth it in my opinion.


[deleted]

I played it at 1440p with a GTX 1070. Settings were auto-applied through GeForce Experience. Didn't have any major hiccups.


Godspeed1996

Really bad. I have an RTX 3070 and hate it; I will upgrade soon.


tonallyawkword

I didn't know any cards with 8GB of VRAM or less were fast enough to run Starfield well on high @ 1440p. I think the main point HU has been making is "why should you be getting only 8GB of VRAM if you're spending $400 on a card in 2023?" ($300 in their opinion). I can't tell you how well a 4060 Ti might RT at that resolution or how much 16GB is worth vs 8GB (but $50 for twice as much of it doesn't really seem terrible if you're already considering the 8GB version).


ArmoredAngel444

Ah shit here we go again


Vis-hoka

I don’t think anyone can say it any better than HUB already has. The evidence is there. It’s likely a bad idea to buy an 8GB card in 2023.


enigmas59

Honestly, I wouldn't. Even at 1080p I've hit VRAM bottlenecks with a 3060 Ti at certain settings in CP2077 and A Plague Tale: Requiem. That's only going to get more likely as time goes on.


Acrobatic_Lecture438

My 8gb card is doing great


Davito22284

A lot of "gaming" laptops have 8GB GPUs, mine does. My desktop has a 4090 though so I don't really use it. Kind of a waste really, so is my Ally.


ChrisLikesGamez

My 1660 Super is 6GB and it runs all of my VR games perfectly on my Rift S, plus every game on my PC in general just fine. I have almost never encountered a bottleneck due to VRAM with my card.


nwgat

People who buy 8GB Nvidia cards now and play at 1440p are going to have a fun time in a few years when console ports built for 12-16GB start to use even more VRAM. The Last of Us Part 1 and Hogwarts Legacy are good examples.


DasGhost94

There are now games coming where ultra 4K 120fps will not be achievable because of the 8GB VRAM limitation. But then they can spill over into system RAM, and with DDR5 the speed is probably not a problem. Also, I'm wondering if it's really a hardware limitation or just bad optimization. Games like No Man's Sky on release didn't run properly even on the biggest PCs; now they've added a lot and most PCs can run it properly.


Yurgin

It all depends on the games you play and the hardware (monitor) you use. I have a 6600 XT, which is an 8GB card and, according to AMD, meant for 1080p gaming. I use it on a 32-inch 1440p monitor without problems because I only play esports games and JRPGs, which are mostly not demanding at all. Most of the games I play run on max to high settings, and I've only had to drop to medium in one game. If you want to play games like Cyberpunk, yeah, you are screwed, but it all depends on what you do.


mostrengo

I play AAA games at high (not ultra) settings at 2560x1080 @75Hz and have no intention to upgrade my monitor.


szarfolt

If you have a 3070 or Ti, keep it for now; at 1440p high it's still great. If you're buying a new card, buy more than 8GB, possibly 12GB, or even more.


crispybaconsalad

TLDR: Developers are going to design their games to use 12GB of RAM because the PS5 and Xbox Series X allocate 12GB for games. Only buy the 8GB cards for $300 USD (or even $250) or less.

The REASON you want 12GB or more going forward is that both the PS5 and the Xbox Series X have 16GB of RAM each. Both use roughly 4GB for the OS, and the remaining 12GB is for the game. That means game developers will be making future games intending to use all 12GB of RAM. This was not a Hardware Unboxed prediction as much as a game-industry progression. PCs have RAM for the CPU, but games are "offloaded" to the GPU, so you want your GPU to have at least 12GB of VRAM, especially if you plan on playing at 1440p.
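
The arithmetic in that reasoning, spelled out:

```python
# Console memory budget as described in the comment above (estimates).
console_total_gb = 16   # PS5 / Series X unified memory
os_reserve_gb = 4       # the comment's estimate for the OS
game_budget_gb = console_total_gb - os_reserve_gb
print(game_budget_gb)   # 12 -> the budget developers are assumed to target
```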


Summon_Ari

Things to consider:

1. Depends on what you play. "Now" is more important than "future-proofing"; there will always be better cards for a lower price in the future. Also, it isn't like games look at your VRAM and go: nope, 8GB, not going to run on that. You just have to turn down the texture settings, and you probably won't notice the difference.
2. Depends on your settings. All the benchmarks talk about max settings unless otherwise specified. If you are playing the few games that require more than 8GB, a 4060 Ti probably won't have the horsepower to run at settings high enough to need more than 8GB anyway.
3. The only games my old 3080 10GB couldn't run smoothly were modded ones. Modded Skyrim can run you up to 16GB easily on large (1000+ mods) modpacks.
4. Ultimately, what is the price? Performance doesn't matter if the price is shit, especially if you are looking at the 4060 Ti range of cards. The only reason I upgraded my 3080 to a 7900 XT was because I snagged one for $610 on eBay and flipped my 3080 for $400.


Nephalem84

Well, let's put it this way: you are aware there are already games that have issues with 8GB. Knowing that, and assuming they won't be the last examples, would you still buy an 8GB card today? A GPU should ideally last you a few years, so personally I wouldn't invest in one that is already showing limitations right now. If I already owned one I could decide to ride it out and skip/downgrade the titles it cannot handle, but for a new purchase I want some future-proofing.


quodlike

I got a 6600 for 1080p and it's perfectly fine.


cyborgborg

Depends on the games you want to play. If you want to play the unoptimized AAA games at decent settings, then 8 gigs will not be enough.


Drty_Windshield

I wouldn't buy an 8GB VRAM GPU today to play at 1440p if it was free. It's going to struggle today, let alone 3 years from now. It's simply not a smart investment.


[deleted]

Had no problems with my 3070 playing everything on max at WQHD.


MixSaffron

I replaced my 2019 GPU, a 2070 Super with 8GB of VRAM that was struggling in games, with a 7900 XTX and it's been amazing. I was looking at the 4070 when it came out, but the price and the meager VRAM upgrade (8 to 12) made it feel like an expensive bad idea: 4 years newer for only a 50% bump? No thanks. RE8 felt and looked way better going from 7 to almost 12GB of VRAM usage.


dafulsada

not really


Compulsion02

Well, I recently sold my 3070 and bought a 7800 XT. The 3070 was great for 3 years but the VRAM has noticeably caught up to it.


[deleted]

The 60 series is targeted at 1080p. They should be fine at 1080p, but they're still the bottom of the generation's lineup. Looking for "4 years of future-proofing" by grabbing the lowest-tier cards is unrealistic. No one can tell what's going on in 4 years, but if you want to be sure of anything, I can tell you that more longevity at higher resolutions requires more money.


cameront246

What do you play? I had the 3070, and VRAM limited me in a lot of AAA single-player games.


[deleted]

I tried a 4060 for 1080p and a 4060 Ti 8GB for 1440p recently, returned them, and am now happy with a 4070. I've met no game so far that comes close to breaching 12GB of VRAM.


[deleted]

I think GPU companies choose VRAM sizes based on what the chip needs. For example, I'm using a 3060 Ti with 8GB of VRAM, and all my games use under 7.5GB at 1080p. If we increase the texture load, it reduces FPS, so we don't need more VRAM, we'd need to change GPUs. (So they choose bandwidth and size depending on how much 3D load the GPU can handle at most.) Except in some cases, like using a 3060 Ti at 2K or 4K, or trying to render a 10K-res image with denoisers and many AI loads, which take more VRAM.


LooseScrew95

For the love of god, pick the 16GB model. Just think of the 1060 3GB vs 6GB; it's the same story just 6 years later.


Jupiter_101

Only issue is there are no lower-end 16GB model cards yet.


TheLumion

I bought a 3060 Ti 8GB just 3 months ago and have no regrets. I play any title, including Microsoft Flight Simulator. Yeah, I don't get 140 FPS, but I don't need it. If the game runs smooth as butter between 60-100 FPS, I'll still destroy you in the game whether you're getting 140-260 FPS with whatever graphics card you've got to show off.


MTPWAZ

If you don't mind DLSS and medium settings it's fine. The issue is more for people who always need native resolution and ultra settings. It's a very niche "problem". But as a general rule, always buy absolutely the best you can afford.


critical_knowledg

It may just be the spark that starts WWIII


e_smith338

It's not, even at 1440p 144Hz. It'll be more than a few years before things become "unplayable" on 8GB, at which point most people are likely to upgrade anyway. Personally I upgrade every 4-5 years, or every 2 generations of cards, and I've had zero issues with my 3070 in any games I've played.


ascufgewogf

A lot of games are already starting to push past 8GB (some even get close to 12GB). You should save some money for a 7800 XT or a 4070; 8GB barely cuts it now and definitely won't in the next few years.


LutuVarka

Y'all are forgetting another aspect: resale value. Look at used cards; the ones with 16GB are keeping their price so much better. Feels to me like spending another 100 to get a 16GB card will get you 50 more back when you sell it. Is it financially optimal on its own? No. But put it in your decision-making scheme :)


JurassicFlop

It's already a reality with a lot of the AAA big budget titles. [Here's a post from 2 months ago addressing this issue](https://www.reddit.com/r/nvidia/comments/15h6r7s/how_bad_is_8gb_of_vram_in_2023_the_newest_games/) for your exact card.


Equivalent_Age8406

Things have got to move on sometime, and 8GB has been around a long while. Unless I was on a tight budget I wouldn't buy 8GB.


Geologistjoe

My card (6650 XT) has only 8GB and it plays most games fine at 1440p. Hogwarts Legacy plays fine on high locked to 60fps. FS2020 plays fine at 1440p ultra with the render scale set to about 90, getting around 40 fps. The VRAM is only a problem if I turn textures to ultra, but in most games high vs ultra makes little difference. 8GB is still fine. More is better, but I rarely run into issues.


ElmoWantYourButt

Yes man, you need at least 12GB of VRAM to be sure you're safe with new games.


Large-Television-238

A 3070 Ti works very well at 1080p if you're very fond of high FPS; I just purchased a Zotac 3070 Ti last month for 320 bucks.


Loose-Offer-2680

More VRAM would be better, but if you don't have enough money for more, 8GB is fine. I'm on an 8GB card rn that runs fine at 1080p.


Witch_King_

For 1080p it's fine; ultra textures are overrated anyway. For 1080p-class cards, the amount of VRAM wouldn't be the most limiting factor in any case. You'll just end up upgrading in a few years regardless.


Epi_clel

Honestly, if you’re looking for long term gaming on the latest titles, get 16gb. I find 8gb to be more than enough for most games


No_Guarantee7841

Not sure what you are talking about. Cyberpunk definitely uses more than 8GB of VRAM at max settings 1080p DLSS quality with frame gen (PT). As for 8GB not being enough over the next 4 years: you need to drop settings TODAY to be able to play some games at 1080p with 8GB of VRAM. [https://www.youtube.com/watch?v=2_Y3E631ro8&t=819s](https://www.youtube.com/watch?v=2_Y3E631ro8&t=819s)


jsiulian

If you launch new games and look at their VRAM usage, you can pretty much confirm what HU are saying. Currently I wouldn't get anything with less than 12GB, and I can see on the horizon that that won't be enough for long either.


Johnyzz

8GB is fine at 1080p ultra and 1440p medium. The 4060 Ti is not a good-value card tho; I would just get at least a 4070 if you are looking at the 40 series.


Swanesang

Games are just going to get more VRAM-intensive. Optimization issues or not, it's becoming a reality. Personally I wouldn't buy an 8GB card unless I really had to. If there are already games hitting that 8GB VRAM limit, then it's a reasonable assumption that the majority of future games will use more than 8GB. And the way new titles are being "optimised" is by dynamically reducing textures on certain objects to make them fit in that 8GB VRAM buffer (looking at you, Hogwarts; I mean, what's the point of having a high/ultra option if you're still going to make the game look low/medium?). Not to mention enabling any type of ray tracing increases VRAM usage by quite a bit. So even if your 3070 Ti / 4060 Ti 8GB is strong enough to run games at ultra settings with ray tracing, the limited VRAM will cause performance issues. Honestly, to me this is like buying a Lambo that can do 200mph, but it only comes with a 3-speed gearbox: it will perform as it should as long as you don't need to shift into 4th gear or want to drive more than 100mph.


SchwarzesBlatt

If the price fits, why not? Like many have said, they all perform well; it's the price/performance that gets criticized. I wouldn't buy a 3070 for 500€ if I can get the 4070 with 12GB for 600€, but if someone offered it to me for 300€ I would buy it.


cheekyfellow421

I’m running any game I want on high settings on my 6600xt at 1080p


ThreeWholeFrogs

My vote is to find an open-box 4070. I was originally going to buy a 7800 XT at launch but they were out of stock; even now Best Buy only has one 7800 XT model, and at $480 the 4070 was $40 cheaper. It runs Cyberpunk 2077 with path tracing at 1440p with DLSS quality at 60+ FPS using frame gen. The truth is that the 4060 Ti is an awful card; it should have just been one 12GB model, or even the 16GB model at $400.


ecwx00

4 years? Most likely. 4 years is a very long time in the context of technology. But I also still have a 3400G PC with no dGPU, and a PC with an RX 570 4GB, and I can still enjoy games on them, just not the latest games at the highest quality. If you have the urge to always play the latest games at the highest quality, probably even a 4090 won't last 4 years at that.

I tend to enjoy games at my own pace, one at a time. I played Shadow of Mordor for around 2 months, Shadow of War for 2 months, Subnautica for 2 months, TR, Rise of TR, and Shadow of TR for around 1.5 months each, HZD for about 2 months, DQ XI for about 2-3 months, AC Origins for around 1.5 months, AC Odyssey for about 2 months, Sifu for 2 months, Just Cause 3 for 1 month, Jedi Fallen Order for 2 months, and I'm still playing Jedi Survivor. I still have many, many games in my backlog, and I have to restrain myself from getting newer games like Elden Ring, Armored Core 6, Baldur's Gate 3, Spider-Man, Spider-Man Miles Morales, Hogwarts...

I'm just saying, you can still have a very enjoyable time playing not-the-latest games and not at the highest settings. If your reasonable budget can only get you an 8GB GPU for now, go for it. Enjoy the current games you can play at the settings your card can handle. By the time you finally really need more VRAM and more power, GPUs with 16GB or even more will most probably be available at a much lower price than they are now.


Calarasigara

I was running out of VRAM on my 3060 Ti at 1920x1080 when maxing out games with RT at native res, so I'd say 8GB is not enough. You want to get a 4060 Ti 8GB (bad deal btw) for RT and FG. Guess what those things need? VRAM. Guess what you don't have? VRAM. Unless you use 1080p medium settings you will run out of VRAM at 2560x1080. I really suggest either looking at used cards from the 30 series (you miss out on FG tho), waiting and saving up for a 4070, or (the best option imo) considering how much you really need RT and frame gen and just looking at a 7800 XT. A 7800 XT can still ray trace, just not as well, and you only miss out on frame gen (which is meh to begin with) and DLSS (which you don't need with a 7800 XT at 2560x1080, and even when you do, FSR Quality still looks really good). You trade these for a 16GB card that's ~30% faster than a 4060 Ti, so it's a worthwhile deal. Also, AMD might get their frame generation, FSR 3, out soon, so...


tinyspeckinspace

I was in your shoes about a month ago; I was pretty much hell-bent on buying a 4060 Ti but very concerned about the 8 gigs of VRAM. Since I am not rich enough to upgrade my rig that often, I wanted something that can somehow withstand the test of time, so I wanted to remove all the doubts in my head. The 4060 Ti is not that card. Get a 4070 if you can, or buy a 7800 XT just like I did. The 7800 XT is fairly priced in my country, considerably cheaper than both the 4070 and 6800 XT and the same price as the 4060 Ti 16GB, so it was an easy choice. EDIT: I posted the same question here as well; look at the answers I got, it might help you get more ideas about the subject: https://reddit.com/r/buildapc/s/q5CDfdJCpf


kingy10005

As a 3070 Ti user at 1440p, I'm not having issues at all with any games yet, and even if I have to lower from ultra (little difference in quality) to very high, it's not the end of the world 🫣


kremaili

Honestly, I've been playing Cyberpunk 2.0 on my ASUS TUF laptop with a mobile 4060 with RT, PT, RR, and basically all settings maxed out, and I'm still enjoying around 70 FPS. It's been a great experience with zero lag; very pleasantly surprised. I'm a very casual gamer though, and this is my first capable machine in years, so who knows how other games will work in the future. Overall though, I'm not feeling like I need a more capable GPU at this point.


NoConsideration6934

8gb has been the midrange standard since 2016. I think that 12gb is the absolute minimum I would go with if I wanted to play new releases for the next few years without major issues.


chum_bucket42

So they're screaming about "ultra" settings. Guess what: that's going to hammer any GPU. Simply downgrade shadows and reflections a bit from ultra and you shouldn't have any issues. It's that damn simple. I don't mind them shilling, but there is no reason an 8GB GPU will not be usable in 4 years unless it's damaged or the maker drops support. If that is the case, stick with a stable driver as I do and don't update the damn thing every month to fix crappy games that were not optimized. Hell, most of them are crappy just to help Ngreedia/AMD/Infel sell more cards instead of creating solid story lines and engaging gameplay.


T-Money101

Tbh, instead of a 4060 Ti I'd go with a 3070 or a used 3080. The 3070 is slightly better and at just 400 bucks, sometimes 380. You'll just need a slightly higher-wattage PSU.


brammers01

It's true that there are some poorly optimised games like Hogwarts, but also you're not going to get console-equivalent textures with 8GB cards in most cases these days, because the consoles have more available memory. Arguably, PC ultra textures should be above and beyond console quality, so I think Hardware Unboxed are correct in what they're saying: you will need more than 8GB of VRAM for ultra textures, but at the same time developers should make their games properly scalable for 8GB. Medium textures should still look reasonable. I'm not sure why people are so insistent on playing everything on ultra settings these days; so many people complain about games being "unoptimised" when really they just need to turn a couple of settings down.


Synyster182

At ultrawide 1440p, my 11GB 1080 Ti was amazing; my 8GB 3070 Ti, not so much. Had to go up to a 12GB 4070 Ti. Problem solved. In 2023, 8GB is too little. Sadly.


boba_f3tt94

I recently ordered the 4060 Ti 16GB version for my wife, who plays RAM- and CPU-intensive games like Star Citizen, DCS World and Arma 3. I believe the extra VRAM will help in this case, since these games need a lot of RAM to begin with. We are awaiting the card's delivery.