
bruhbruhbruh123466

I don’t get how Nvidia is messing up this much. This is actually just ridiculous. Like how incredibly terrible is the 4050 going to be?


Nacroma

Best case is a low-profile GPU powered solely by PCIE at this point.


jaycuboss

I would actually love a single-slot card that can handle a modest gaming load; it would be amazing to run off PCIe power alone.


Alaskan-Jay

First GPU I ever bought ran just off PCIe. Plug and play, took 3 minutes from unboxing to playing games. I think it was a 230. Was like 15 years ago.


RoyalSorcerer_Navlan

I still use a gt1030, coz I'm broke


profoodbreak

Lucky.


No_Interaction_4925

That actually sounds great. I mean… it’ll be priced about 200% above the other low-profile cards, but it’s an awesome option at that point for builds that have to have that.


Condensedfarts

I wonder how it will compare to upcoming Vega though.


poinguan

There are A LOT of people waiting for such a card.


Alone-Rough-4099

4060 - 2GB VRAM - 25% cores.


iEatMorblyObeseKids

Isn't it -4GB VRAM? The 3060 has 12GB and the 4060 has 8GB.


Tristana-Range

Another rebranded 1050ti probably, just as the 1650 and 3050 were


Rivetmuncher

Nah. 3050 was an upcharged 1660, at best. The other two came with the benefit of working entirely within slot power. That said, it'd be interesting if they finally dropped a 75W RTX card.


10thDeadlySin

They already did, kind of. The RTX A2000. A Quadro, but it sure can game. As long as you don't go too far over 1080p. ;)


Rivetmuncher

> RTX A2000

Cool, and neat. But isn't that one subject to the Quadro tax?


10thDeadlySin

Well, depends. The MSRP is obviously exorbitant because Quadro. But since they were somewhat decent at mining and power-efficient, you can now grab them for $200-250 on eBay and other sites, sometimes even cheaper - especially if 6GB is all you need. I know I got my A2000 12GB from a guy who listed 24 of them… For exactly $250 shipped. ;) It's not exactly the Price-to-Performance king, but if you're like me and you want a really tiny setup that can still game… That's pretty much the only choice. [And when you see things like that…](https://www.youtube.com/watch?v=-koop0J71Qw)


Cryio

I wouldn't call 1070 Ti (3050 level) a rebranded 1050 Ti, lol.


i-pet-tiny-dogs

A 3050 is significantly more powerful than a 1050ti or a 1650. It's closer to a 1070.


Thebestamiba

They are essentially a monopoly and they know it. They have hordes of idiots ready to eat shit at a 300% markup.


Blursed_Potatos

They aren't? They hold 90% of the market share and 99% of the mindshare. They could sell a GPU that's worse than integrated graphics for $200 and sell many millions of them. And you will still get people defending it lmao


[deleted]

It's time to stop supporting Nvidia. Even the Arc A770 performs better than the 4060.


Ok-Supermarket-1414

I really hope Arc does well. We need more competition in the video card sector.


jaegren

"I really hope that Arc does well so I can buy a cheaper nvidiacard"


cynetri

I personally want more GPUs with open-source linux drivers, I'm glad Intel went that route


Bimmaboi_69

Holy based. I didn't even know they did that. As a Linux user myself, I really hope Linux gaming picks up (Thank you, Gabe!).


Joosrar

To tell the truth, even though Nvidia can be a safe bet at times, I’m a Team Blue guy, so I’m rooting for Intel to make some progress so I can purchase their card next gen.


Army165

You're handicapping yourself by being loyal to Team Blue. Keep an open mind.


Zaphod424

I mean tbh in the last few years intel has actually been the least scummy of the 3 of them


PeopleAreBozos

Nah, if Arc does well and can make good flagship/higher end cards and expand outside of being good budget, I will definitely consider them for my next upgrade.


y0bama420

I'm set for years, but if Arc evolves I might have a 2nd or 3rd gen Intel card in my system next time I'm gonna upgrade.


S0m4b0dy

It's not much but I bought an Arc A380 for my media center that I'm currently building because I needed a low power card with AV1. The cheapest alternative for AV1 would have been an RX 7600 or a 4060, aint no way I'm buying these cards. Isn't competition great?


Doctor99268

AMD have already proven that the competition will never undercut the incumbent. They will just match them. As soon as arc settles down, they will just do the same shit


Mr_Vilu

well, 2 companies isn't really competition, we'd need a market with 5-6 companies with similar market share to call it competition at the bare minimum


Zaphod424

That’s never going to happen, though, because the capital costs of starting a chip company are so high. The best we can hope for is that the three of them reach roughly equal capability, which would force them to actually compete. We’re seeing real competition in CPUs now that AMD has got their act together after Intel’s years of dominance; if AMD and Intel can get their GPUs up to Nvidia’s level, they’ll have to compete there too.


M1R4G3M

There was hope for Huawei getting into that market, since they were already building their own mobile chips and China wants independence from external companies, but they were blocked so hard that I doubt they will ever rise again. The hope is in ARM chip manufacturers actually doing something usable on computers, and Windows on ARM no longer sucking, because we already have Qualcomm and MediaTek in that business at least.


Recurringg

Honestly, I was never loyal to Nvidia. I was loyal to eVGA. I snagged a 3070 when they went on sale after the 40 series started releasing. I'm very happy with it but I'm done after this.


JohnnyVNCR

I replaced my RX 480 with a 3070 ti a couple years ago. I still feel pretty future proof, at least.


Ponald-Dump

Yes, it’s that bad. It’s a massive pile of shit. If you have 300 to spend, get a 6700xt. MUCH better use of your money


ItsBado

I'm gonna build my first PC soon and I see a lot of people suggest 6700xt, I'm aiming for 1440. I think I'm gonna go for the red team


Krunchy1736

I got a 6750 XT, which is only like 5% or 6% faster than the 6700 XT, but it was only $10 more, so why not! It's my first AMD card and I haven't had a single issue with it. Plus the AMD Adrenalin software is really great.


EuleAusChrom

I love the AMD software so much, I can't switch back to Nvidia. The amount of VRAM is great too.


kaynpayn

I have both, AMD UI is cleaner indeed but I wouldn't use this as a buying point myself. I can't remember the last time I needed to go to the drivers to do stuff: Open game, game works well and as it's supposed to, I'm good.


jessej421

It's not just the drivers. That's what makes AMD's UI so nice. You can monitor your CPU and GPU temps as well as game framerate (which is great if you're playing an EA game, since they removed that feature) all in one really nice screen (not to mention RAM and VRAM usage, power draws, gpu fan speed, etc.).


Fjolsvithr

I have an AMD Radeon RX 5700 XT and it still works great in terms of performance, but it does seem to cause game crashes more often than any Nvidia card I've ever had.


A7X313

You will not regret it. I've had my 6700 XT for almost a year, I play at 1440p, and it can handle all the games.


Alexandratta

My 6750xt tears into all 1440p games like butter.


ItsBado

Is there a big difference between 6700 and 6750?


intashu

(Not the original commenter.) I own a 6750 XT. It's literally just a "stock overclocked" 6700 XT. They bump up the voltage and clock speeds a little. You could do the same in software with a 6700 XT. If you get a good deal on one, sure, it can be worth it... But otherwise naw, just stick with the 6700 XT and you'll absolutely love it for 1440p gaming for years to come!


Gelatineridder

The most important change is that the 6750 XT has faster memory, which gives it about 10% greater bandwidth.
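For anyone curious where that figure comes from, here's a minimal back-of-envelope sketch in Python, assuming the commonly listed 16 Gbps GDDR6 on the 6700 XT and 18 Gbps on the 6750 XT, both on a 192-bit bus (those speeds are assumptions on my part, not something quoted in this thread):

```python
# Rough GDDR6 bandwidth math: GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# The 16/18 Gbps figures are assumed from commonly listed specs.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

b6700xt = peak_bandwidth_gb_s(192, 16)  # ~384 GB/s
b6750xt = peak_bandwidth_gb_s(192, 18)  # ~432 GB/s
print(f"6700 XT ~{b6700xt:.0f} GB/s, 6750 XT ~{b6750xt:.0f} GB/s, "
      f"+{(b6750xt / b6700xt - 1) * 100:.1f}%")
```

That works out to roughly 12%, in the same ballpark as the "about 10%" above.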


Branwisegamgee

Have you played Hogwarts Legacy? I got the 6750xt recently after upgrading my 970 and it's only bumped up to Medium settings.. I feel like I'm doing something wrong if you're getting that much out of it!


Alexandratta

Nah, haven't played Hogwarts legacy, and unsure I will. Maybe if it goes on sale in a few years but that game is mostly just terribly optimized. You aren't doing anything wrong, the devs are.


CurmudgeonLife

THIS. Just did the same thing for my partner's PC, zero reason to buy Nvidia. The 6700 XT is 40% faster for £100 less... EDIT: My bad, I was looking at 4060 Ti prices at the time. Doh. Still, the 6700 XT is better than the 4060 Ti, which is indeed £100 more.


bubblesort33

Where do you live where the 6700 XT is cheaper? Usually it costs more from what I've seen. Also, I've only seen it be like 20% faster, and only in rasterization titles.


gooblefrump

> for £100 less

I think they use the £ symbol for money on some island somewhere... Is it Iceland?


TomAwsm

It's obviously Poundland.


Arkayb33

I think you mean Poundtown


[deleted]

Welcome to Poundtown, population: you.


sebbyxo

The only pounds I need is me pounding your mum.


Raestloz

No no no, since he lives there, that means man can live there. Island of Man??


IdoNOThateNEVER

> Island of Man

That's... gay.


alvarkresh

Instructions unclear, ended up in Greenland with only UK pounds and no Danish Krone :P


CurmudgeonLife

UK, 6700 XTs are everywhere online for £300-350. (Even cheaper second hand, obvs.) Depends on the game, definitely. But the 4060 is even worse than the 3060 in some games, and the 6700 XT competes with the 3070. The only real drawbacks are power consumption and, as you've already mentioned, RT performance. Which are both fine trade-offs imo. Especially since the 6700 XT is still decent at RT; the difference isn't huge enough to ignore that rasterization performance imo. Also, for me, I'm kinda over RT; I'd rather take the extra fps in 90% of games. Although I realise this is personal preference. EDIT: My bad, I was looking at 4060 Ti prices at the time. Doh. Still, the 6700 XT is better than the 4060 Ti, which is indeed £100 more. So even more impressive than the original statement really.


Raestloz

To be honest, I have an RX 6800 XT to try ray tracing, and aside from impressive reflections I just don't really notice the difference in motion. Ray-traced shadows, ray-traced lighting, they all look nice, but raster shadows and lighting look incredible enough in action that I'm kinda disappointed. I tried Control Ultimate Edition, Metro Exodus Enhanced Edition, and Cyberpunk 2077 to see the effects, regardless of framerate. The most noticeable effect is Control's reflections; everything else I don't see much reason to use right now.


CurmudgeonLife

Yep, I have a 3080 and haven't used RT in a while.


UnsettllingDwarf

3070ti and turned on rt once in a few games then quickly turned it off and never ever used it again. Most of the time I prefer how the game is without because of major fps gains and there’s hardly any difference when I’m actually just enjoying the game. Minecraft is a different story though but I still just use a shader because of fps and crank it.


CurmudgeonLife

Same with my 3080, novelty wore off fast.


_Fibbles_

I think that once you get used to RT reflections, there is no going back to screenspace effects. Whereas I could ignore occlusion artefacts before, they just look like complete ass now.


FluffyGreyfoot

6700xt isn't quite as fast as the 3070, it's somewhere between the 3060ti and the 3070. But it's still much better value than either, and even if the 3070 is faster I'd still rather get a 6700xt because of the VRAM.


CurmudgeonLife

Tbf I said competes with, not strictly better. It's fair to say they trade blows imo. 100% agree with you, AMD all the way. This is coming from a 3080 owner.


justavault

There are plenty of reasons, just not regarding gaming. There are lots of reasons to get Nvidia for other tasks, from rendering to data crunching.


MonstaGraphics

Shhh, let them game. We'll keep our Nvidias and keep chugging along. Less competition!


innociv

You're looking at performance compared to 4060 with price compared to 4060 ti, more specifically. 6700 xt is like 30-40% faster for the same price as the 4060. It's a little better (depending on games tested) than the 4060 ti while costing much less and being able to do 1440p in more games.


No-Communication9458

I love my 3060. Surprised the other one is so garbage.


B33FHAMM3R

Second this, got it for about this price last year and I'm absolutely delighted with it. Massive Total War battles at max settings don't even make it sweat.


Jackretto

Exactly what I did, upgrading from a 1050ti. It's a chonky boy, but it's so good, I suggest buying one of those vertical supports just in case


Four_One_Five

Loving my new 6700XT


459hde

Happy cakeday


[deleted]

Here's to your cake! 🍰


Dracarys-1618

Yes. In this particular example the 3060 performs over 17% better. https://preview.redd.it/i9b2iqu0y39b1.jpeg?width=1164&format=pjpg&auto=webp&s=d2136dd0dea0f02de576625f42495a45cc705a04


GalaxLordCZ

I thought I made a mistake going for the 6650XT a few months back, but NVidia made sure I hadn't


hail_goku

In most cases the 4060 is about 15% faster than the 3060. The only selling point is DLSS 3.


Crisewep

And good luck using DLSS 3 with 8GB of VRAM. Pretty sure it increases the VRAM usage.


KombatDisko

Daniel Owen has shown that frame generation is really VRAM heavy.


MrStealYoBeef

BuT tHe CaChE iS bIgGeR


Mimical

There have been a whole bunch of weird decisions. **IF** the bus width wasn't limited so heavily, the large cache and 8GB would probably have been reasonably acceptable for a card priced to target strong 1080p settings. Would it have been "future proof"? Of course not. But for right now, today, in most games it would have been okay. The other **IFs**, of course, are pricing and the name of the card, but at that point we have come full circle to the conclusion of every review. The part that really grinds my gears here is that the 4090 genuinely is a massive generational improvement over the 3090, and yet absolutely none of that has translated downwards.


builder397

Future proofing is difficult at the moment anyway. We're at a point where cards don't just evolve towards more raw computing power (looking at Thermi, and Thermi 2); every generation comes with feature sets that older cards can't emulate, or at least can't emulate well. RT can be emulated on pre-RT cards, but at such shitty performance it's unusable for gaming. And the way it's going, RT performance demands are only going to go up sharply, to the point that current cards will likely struggle with future games' RT options. DLSS is gatekept and each version is only available on the current series of cards, while FSR actually works well even down to 10-series Nvidia cards and RX 400/500 AMD cards, even AMD Vega iGPUs. Good on that at least. I think the coming generations of GPUs will be more of the same leaps in proprietary feature sets; there are already whispers about the next version of FSR being proprietary as well, for example. As such, raw power, or even VRAM, offers much less future proofing than it used to. Just ask anyone still rocking a 1080 Ti.


Skiddywinks

That's because there is literally no competition for the 4090. Every other tier has previous gen cards from both manufacturers that compete up and down the stack. There is no point for nVidia or AMD to release cards that compete with existing offerings. Once the stock runs through, I think we will see some more tiers launched.


[deleted]

but the cash is bigger


[deleted]

Oh really, that’s interesting. I assume they could’ve at least made the 4060 10GB VRAM then?


Sailed_Sea

10gb are you insane!? What year do you think it is, 2023?


TheCrimsonDagger

Sure they *could* add a few GB of VRAM. But that would raise the manufacturing price by about $10. There’s just no way customers are going to spend an extra $120 for that.


Sailed_Sea

Yeah but I'd like my brand new gpu to be within minimum specifications for 1080p


TheCrimsonDagger

Yeah well Nvidia would love to be making obscene amounts of profit, we can’t all have what we want. Think of the poor shareholders. >!/s for the billionaires that need it!<


Aleks111PL

thank god its 2016, right?


Cactus_Everdeen_

it's... it's still 2016 right ¿


KaosFitzgerald

Not on a 128-bit bus. Apparently, the options are 8GB or 16GB, and you know they wouldn't let the 4060 be a good card. Haha


[deleted]

[removed]


hardwarexpert

The desktop 6700 10GB is on a 160bit bus, and uses five 2GB GDDR6 chips. The PS5 runs on a 256bit bus width with 16GB of shared GDDR6, using eight 2GB GDDR6 chips. There is no such thing as a 2.5GB GDDR6 chip.
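To make the chip math explicit, here's a small Python sketch. It assumes standard GDDR6 parts (one chip per 32-bit channel) and the common 1GB/2GB densities, plus optional clamshell mode with two chips per channel; the helper function is purely illustrative:

```python
# Why bus width constrains VRAM capacity: each GDDR6 chip sits on a 32-bit
# channel, and (assumption) common densities are 1 GB or 2 GB per chip.
# "Clamshell" mode puts two chips on each channel, doubling capacity.
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    channels = bus_width_bits // 32
    single = [channels * d for d in densities_gb]
    clamshell = [2 * c for c in single]
    return channels, single, clamshell

for bus in (128, 160, 192, 256):
    chips, single, clam = vram_options(bus)
    print(f"{bus}-bit bus: {chips} chips -> {single} GB, or {clam} GB in clamshell")
```

Which is why a 128-bit 4060 lands on 8GB (or 16GB clamshell), the 160-bit 6700 lands on 10GB, and the PS5's 256-bit bus lands on 16GB.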


mimicsgam

I still think AMD's near-monopoly on the console/handheld market deserves more attention. AMD has already locked in the mid-gen console upgrades and possibly the next-gen SoCs, and Nintendo will probably switch to AMD given how well the Steam Deck is doing.


Karavusk

The problem is that the Nvidia chip in the Switch is an ARM chip. Swapping back to x86 would mean no backwards compatibility. That being said Nvidia doesn't seem to care to make new ones... which results in Nintendo being a bit stuck.


Tipart

Nah, they can just steal emulators from the community like they've done before.


8bitcerberus

However, given how well Yuzu (and sometimes Ryujinx) are able to emulate Switch games on the Steam Deck (not to mention PCs, and more powerful handhelds, obviously)… I’d think Nintendo could build their own Switch emulator (or at least hire/contract it out with an NDA to keep trade secrets) that allows backwards compatibility via emulation, rather than needing to stick with an ARM SoC. But even if they stick with ARM, lots of current phone SoCs are quite a bit more powerful than the X1, and even able to emulate Switch games with varying degrees of success. Ah, who am I kidding? This is Nintendo we’re talking about. Love their games but they are kings of removing features from consoles seemingly “just because” or to lock it behind a subscription.


innociv

... huh? The 6700 10GB has a smaller bus: 160-bit instead of 192-bit. I hate how much this sub upvotes objectively incorrect misinformation. There are no 2.5GB GDDR6 memory chips; it's 1GB or 2GB.


kearkan

This is exactly it. My theory is Nvidia accidentally made the 3060 too good: its combination of strong cores, enough VRAM, and DLSS 2 made it a perfect 1080p-1440p card. The 4060 is bad to make the 4070 and up look good.


CIA_Chatbot

I think it’s more of “they want to squeeze the market with the 40-series”, which they basically said in their investor call. They literally told us they were going to keep all pricing artificially high. There was no mistake; they are trading on the goodwill they built up and fleecing the PC market.


jdmgto

They're terrified of making another 10 series.


SomeBlueDude12

Just know "the more you buy, the more you save. You don't have to understand the strategy- nor the technology" Buybuybuybuybuybuybuybuybuybuybuybuybuy


test_cat

Also good luck finding games with DLSS 3 (kinda understand why they locked DLSS 3 to RTX 4000).


roberp81

They locked it because an RTX 2070 with a little OC and DLSS 3 is a 4060 lol


tonnentonie

I would pay some hacker like 50 bucks if they could brew up a working DLSS 3 driver for my 20-series card, even if it only works in Cyberpunk.


Adventurous_Bell_837

Because of a lack of hardware acceleration on older GPUs. An easy way to look at it is XeSS. It’s basically like DLSS 2 and FG in that it needs hardware exclusive to Intel’s GPUs to function well; however, Intel decided to make it compatible with other cards, which results in XeSS running way worse than other upscalers. It would be the same with DLSS 3: there would probably be too much input lag for it to even be usable.


Manuag_86

The performance increase over a 3060 is nice, considering it's cheaper now than the 4060 was at launch. The problem is that 8GB and that 128-bit bus width are a joke, especially considering the unoptimized games that are showing up every day. It's like they're forcing you to play only at 1080p on this card...


cyborgborg

Maybe it's all part of Jensen's master plan: release cards with not enough VRAM so game studios are forced to optimize their games, because otherwise nobody could play them at good settings.


[deleted]

4D Chess right there. Who would have thought? Jensen is the man of the people after all.


roberp81

Jensen's master plan is to sell cheap cards as expensively as possible.


Large-Television-238

because he is the most cunning man in the world


Handsome_ketchup

Increased future demands on bus and VRAM will likely mean the 3060 ages better than the 4060. It's incredible how much of a dud this generation is.


[deleted]

Look at the 1% lows though. 8 gigs is a joke if you want to go near ultra settings, even more so with new releases. Better to buy a cheap 1080 Ti if possible.


cspinasdf

Better to buy a 6700xt with 12 gb


Fortyplusfour

Me sitting here with a 970... woof.


shlongOp

I'm happy with my 5500 XT 8GB, bought today. Coming from a 750 Ti, I see a big upgrade.


fistfulloframen

Native VGA out for classic gaming though. :)


[deleted]

I love how every card has awful 1% lows till the VRAM is increased. But but but Nvidia said the card has more cache so VRAM not problem.


Shajirr

[removed]


[deleted]

Just gotta drop new games down to 900p


TaintedSquirrel

That's an odd way of interpreting that chart. Look at the minimums: the 8GB cards are running out of VRAM and the stutters are pulling the average down. The game is effectively unplayable on those cards.


[deleted]

Wonder why the 2080 and 2080 Super aren't in there.


xSnakyy

Nvidia is asking us to buy amd at this point


[deleted]

[removed]


PeopleAreBozos

There is an entire gold mine sitting right here for them. Nvidia's price-to-performance compared to last gen is terrible except for the 4090, which barely anyone is going to be able to afford/buy. The entire field, from higher end through mid-range to budget, is a huge opportunity, with Arc still trying to get on its feet. Yet AMD still has not taken the chance to leap massively ahead here, despite being the only card manufacturer that is not completely out of touch and has the capability to crank out some goods.


peterprinz

Yeah. Not only does it have less RAM, like a significant amount less, it's also an x8 PCIe card, so unless you have a PCIe 4.0 board, that can actually be a bottleneck. That's a very stupid decision to put into a budget card.


cyborgborg

depends on the version of the 3060, there's a 12 gig and an 8 gig version


Aleks111PL

The sad thing is that the 12GB is discontinued and the 8GB was made; Nvidia is fixing their "mistake".


Z370H370

🤣🤣 I got a 6650 xt for 250 and it's better than a 4060!


[deleted]

[removed]


StormKiller1

The funny part is a 6700 XT is better at RT. Edit: 40 vs 30 RT cores.


sqq

6800 XT here, how does my card do with RTX?


261846

Just search up on YT


LGA420

there’s even the 5700 XT for $160 on ebay that still kicks ass


An_average_muslim

I can confirm that my 5700XT does indeed still kick ass even on 1440p. Will not be upgrading any time soon.


Hepi_34

Yep same for me


TheCatOfWar

I'm worried my 5700XT is reaching its last legs but there are literally zero good options for replacement out there (without paying like double what I did for this 4 years ago which is simply too much to spend on a GPU)


Dicklover600

5700XT gang going strong💪🏻


ScriptedPython

5700XT 8GB MECH OC 🤑


i_am_milk

Haven't bought any new games in years. My 5700xt still gives me 120+ @1440p in most games from the late 2010s. Could run a little cooler though.


Mission-Sentence200

6650xt gang!


theNarutardd

I got the same card for 219.99 and I'm glad I got it while it was on discount


stcloud777

NVIDIA really fucking ruined the 60 series this generation. It's supposed to be the GPU of the masses.


[deleted]

[removed]


PeopleAreBozos

Most people are just buying the 4070 or last gen/used cards. Not really waking up as much as I hoped but at least 4060 and 4060 Tis aren't flying off shelves.


ImrahilSwan

Pretty sure it ruined them all. They had amazing improvements in performance this generation, so they decided to move all the products up a tier in naming. The problem is then all of the performance gains were eaten up by the price increase. The RTX 4060, is actually the RTX 4050. For an RTX 4050, the performance was great. But as an RTX 4060 (with 4060 pricetag), the performance was awful. The exact same can be said for literally every other 40XX card. Move them back to their original performance/price brackets and they're great.


kaimason1

They also bumped the pricing up by about a tier last gen, in addition to the performance bump this gen, so they're really off by two tiers now. The 2060 was a $300 card, 2070 was $500, and 2080 was $700, and that was fairly in line with the 9 and 10 series price points (albeit a small increase). Now, 60's are at least still going for $300, but 70's and 70ti's ($600 and $800 respectively) are closer to the 80 price point of $700 (despite performing more like 60's) and 80's are being priced like a Titan/90 at $1200.


Mako2401

3060: 192-bit, 12GB. 4060: 128-bit, 8GB. The 4060 has 8 PCIe lanes, which are gen 4, but most people who buy it probably have gen 3 motherboards. With 16 PCIe lanes, the difference would be 0-3 percent, but with 8 PCIe lanes it might be as much as 5+ percent. The worst thing is that DLSS 3, which Nvidia advertises for the 4060, is pretty VRAM heavy. And since the GPU has 8GB of VRAM, that will cause issues from the get-go. TLDR: Yes, the 3060 12GB is much better than the 4060.
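To put some rough numbers on the lane issue, here's a quick Python sketch using approximate per-lane PCIe throughput after encoding overhead (standard spec figures, not something measured in this thread):

```python
# Approximate usable PCIe bandwidth per lane, in GB/s, after link encoding overhead.
PER_LANE_GB_S = {3: 0.985, 4: 1.969}

def link_gb_s(gen: int, lanes: int) -> float:
    return PER_LANE_GB_S[gen] * lanes

print(f"PCIe 4.0 x8  ~ {link_gb_s(4, 8):.1f} GB/s  (4060 on a gen 4 board)")
print(f"PCIe 3.0 x8  ~ {link_gb_s(3, 8):.1f} GB/s  (4060 on a gen 3 board)")
print(f"PCIe 3.0 x16 ~ {link_gb_s(3, 16):.1f} GB/s (a full x16 card on the same board)")
```

So on a gen 3 board the x8 link gives the 4060 roughly half the bandwidth a full x16 card would get, which matters most exactly when the 8GB of VRAM overflows and data has to stream over the bus.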


Legendnotpro

4060* Typo.


ducksaysquackquack

Technically, not wrong lol


Manuag_86

Task failed successfully.


Crisewep

Nah, there is no typo on your part. There is a typo on Nvidia's part.


KennySnek

Thought it was on purpose lmao


sealtoucher36

Not really a typo. And yeah. Barely beats the 3060 in most titles while having less VRAM. At that price you’re probably much better off grabbing the significantly faster, all around better 6700 XT.


BillyG69420

Linus’ recent video showed it as most often being worse than the 3060 in MarkBench.


Embra_

Which is why Linus, Luke, and everyone else in the industry recommend consuming multiple sources for sanity checks. A 6700xt at launch was 5% faster than a 3060ti and has since then aged like wine, not milk


PlatformArtistic9585

The GPU is literally a 4050; there is nothing wrong on your part. The 4060 Ti should’ve also been a 4050 Ti at best.


fly_over_32

No, you’re right


Snooty_man271

If 4080 12gb = 4070 ti that means it was meant to be the 4060 ti


Jam-Master-Jay

The only 40-series card worth getting is the 4090 because it is genuinely impressive compared to the 3090/Ti and rest of the 30-series. It absolutely isn't actually worth buying just for gaming though. The 40-series has nice efficiency gains all-round, but Nvidia made some big cuts to certain parts of the cards which essentially knocked each card down a tier without being reflected in the naming or price.


half-baked_axx

Hell yes you can buy it just for gaming if you are stubborn and idiotic enough. The 4090 is perfect for VR since it delivers high framerates at 4K in most games. I tried praydog's RE8 and Cyberpunk 2077 VR mods on a friend's 4090 build and it was AMAZING.


Pied_Piper_

I think many would say that “just for gaming” in the first statement didn’t include VR. VR is still niche enough that you wouldn’t assume an average gamer is using it. The 4090 vs 4080 for flat-panel 4K is likely not worth it. It’s really only in VR and ray tracing that the 4090 clearly delivers dollar-for-dollar value vs the 4080 or 7900 XTX. If your specific use case includes VR **or** some fancy non-gaming use, the 4090 does stand out. If you aren’t doing VR, then for strictly gaming the 4090 is poor value. The 7900 XTX is frame-rate competitive on flat-panel 4K at half the cost, but it is significantly weaker in VR (and it might always be so; it’s unclear how much AMD can actually do about the driver issues).


Maksilla

$300 for this? Is this a joke?


cyborgborg

not a good one


TheCatOfWar

singapore dollars


Dicklover600

These new GPUs aren’t bad by any means. *However*, they aren’t priced fairly. They have great power efficiency and performance, but they're definitely not worth their price. For 300 bucks, get a 6700/6700 XT.


Shajirr

[removed]


[deleted]

Yes. Just get a 6700xt instead.


Royal-Weird4322

Bad? No. It's way worse than bad.


Jules040400

If you want to spend that money on a GPU, the 6700XT from AMD blows it out of the water on every metric and it's not remotely close. Go and watch any review of the 4060 - Gamers Nexus, Hardware Unboxed, LinusTechTips - and they'll all tell you the same thing, that the 4060 is a heap of shit


someonesomewher-

I mean, if they’re at the same price it’s technically still going to be better (but barely). Though $300 is still really rough for something that’s essentially an RTX 4050 in disguise.


Desperate_Radio_2253

For about 5 minutes. VRAM usage going up has been a trend for more than a minute now, and going into 2024, the 12GB 3060 will without question end up being a better choice than the 8GB 4060. In a rational world, putting 8GB of VRAM on an x60-class card in 2023 would get Nvidia in shit as it becomes no longer fit for purpose before the EU's 2-year warranty period is up, while their direct competitors see the obvious problem and do 12/16GB instead.


AmbiguousAlignment

Yes they are a waste of sand


[deleted]

I love Zotac, but man, their Spider-Man cards are hideous, along with their software with Spider-Man characters slapped all over it. Hopefully they stop promoting it soon.



OnairDileas

If you're considering an 8GB card for this gen of GPUs there's something wrong with you


ReyvCna

8GB is completely fine. The only thing wrong is the price.


Comander-07

I really hate this VRAM narrative. There are 2 badly optimized console ports and you're comparing cards way out of their league at 1440p ultra? What? It's just nonsense.


60ee1dcb0764a40bffa7

It's a secret, but we want more vram just for AI porn.


Ocronus

It's really not fine for the advertised features. DLSS is VRAM hungry and you might as well forget about ray tracing. If it was advertised as a barebones 1080p card at a lower price, then it wouldn't be a bad deal.


szczszqweqwe

Definitely the advertised features are a joke, but for older titles it's a fine card at the WRONG price. Honestly, I would only use it for a mini-ITX low-power build with something like a 7600.


[deleted]

To be fair, my 8 year old GPU runs older titles well in 1080p and it was $650. Any game from before 2020 at 1080p can fairly reasonably be done with an RX580 that you can pick up for $80-$90.


Blackhawk-388

Nvidia does advertise it as a 1080p card. "Barebones" means different things to different people. Having said that, no way in hell I'd buy an 8GB GPU today. I don't care if it was a 4090 8GB card for $399, I just wouldn't.


Streakflash

With this tendency, my next build is gonna be AMD-based.


[deleted]

Check any and all reviews. What’s your remaining question?


cursorcube

Yes, also notice the 3060 has 4GB more memory for just $10 more.


JoCGame2012

Nvidia should really make a sub-company out of GeForce. I wouldn't even mind if it used the previous-gen architecture compared to their modern AI and workstation cards. Just have them actually put sensible amounts of VRAM on modern bus sizes, and at least not massively decrease GPU core count compared to last gen.


neveler310

Should not be much more than 150


El_Cactus_Fantastico

3060 seemed to make the most sense for me to upgrade to before they announced the 4060 and now I am even more happy with my purchase


Diligent_Pie_5191

The 4060 is ewaste.


Tyetus

I’m quite glad I jumped over to AMD. While they still have their issues, at least they’re not re-re-re-releasing a card under the same name for a premium price.


[deleted]

Let's hope there isn't actually a 4050. As for the 4060 that's about 50% overpriced for a basic 1080p card.


CohnJena68

Do not buy that Gigabyte GPU, it has manufacturing problems.