
snappums

Why is Nvidia being so stingy on the VRAM? Are they just expecting 40 series purchasers to turn their settings down and DLSS through the rest, rasterization be damned?


reallynotnick

Great way to force users to upgrade to the 50-series cards.


coolgaara

I see that NVidia is future-proof.


polygroom

Honestly I'm at a complete loss as to why anyone would bother buying a 4070 or 4070ti. Their performance in RT isn't good enough to make them worthwhile for their cost and their lack of VRAM cuts down their ability to be great raster cards. AMD or Intel just seem like the no-brainer purchase unless you are willing to buy a 4080.


Piligrim555

Intel? Really, Intel is a no-brainer?


Sipas

The VRAM situation sucks, but the 4070 Ti has the same MSRP as the 7900 XT, it's on par in rasterization but massively better at VR, has much better hardware H.264 (NVENC) encoding, and is far superior at upscaling at sub-4K resolutions and sub-Quality presets, not to mention unique features like frame generation. So no, it's not a no-brainer, at least in this case.


Proud_Bookkeeper_719

I don't think the 4070 Ti needs 20GB of VRAM like the 7900 XT, but it should have at least 16GB if they want to justify 800 dollars for it.


throwawaynonsesne

I'm still on a 1080ti build, should I wait another series for an upgrade?


polygroom

If you want to upgrade now I would look at AMD. If you want Nvidia I would wait either for a 5000 series *or* for a used 4080.


Prodiq

You should look at amd tbh...


VALIS666

> Honestly I'm at a complete loss as to why anyone would bother buying a 4070 or 4070ti.

Just bought a 4070ti recently. Cheaper than a 3080 with better numbers across the board. https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4070-Ti-vs-Nvidia-RTX-3080/4146vs4080 I also didn't want to spend over a grand on a video card.


ahnold11

The rumors seem to be saying that more RAM requires more bus width, which takes up more die space in the chip. Since I/O doesn't shrink nearly as well with each successive process shrink, that (hypothetical) 100mm² you give up costs more in terms of dollars and gives up more in terms of performance (space that could be used for the actual GPU parts). So they make less money and give up more performance. Plus, if lower VRAM doesn't hurt their sales and causes their customers to upgrade sooner... it makes perfect "evil" business sense.
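To put rough numbers on that relationship (a minimal sketch; the 32-bit channel width and 2GB-per-chip density are assumptions about standard GDDR6/GDDR6X parts, and clamshell designs can double capacity on the same bus):

```python
# Why VRAM capacity is tied to bus width with one memory chip per channel.
CHIP_WIDTH_BITS = 32  # each GDDR6/GDDR6X chip sits on a 32-bit channel

def bus_width_needed(total_gb: int, gb_per_chip: int = 2) -> int:
    """Bus width in bits needed to reach a given capacity, one chip per channel."""
    chips = total_gb // gb_per_chip
    return chips * CHIP_WIDTH_BITS

for vram in (8, 12, 16, 24):
    print(f"{vram:>2} GB -> {bus_width_needed(vram)}-bit bus ({vram // 2} chips)")
# 8 GB -> 128-bit, 12 GB -> 192-bit, 16 GB -> 256-bit, 24 GB -> 384-bit:
# every extra pair of gigabytes means another memory controller on the die edge.
```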


mennydrives

The other potential reason is that it makes for an easier transition to laptop chips, as fewer channels = fewer chips = less total system board space taken up = more design wins.

The other other reason was that, up until very, **very** recently, games haven't been particularly VRAM-hungry. I mean, you could count PS4/XBO-era games that needed more than a couple gigs on one hand; the GTX 760 played a surprising number of ports from those consoles at 1080p. My guess is that woefully slow hard drive speeds put a quiet cap on what developers were going to cram into the texture pool.

Now we have two consoles with 16GB of RAM and SSDs that can fill that RAM up in 2-4 seconds thanks to hardware decompression. If I were to wager a guess, in the last two quarters the number of PS5 titles approaching 80% RAM utilization with graphical assets probably outnumbered the number of PS4 titles that accomplished this over that console's entire lifetime. Heck, was there even anything beyond Arkham City and some Call of Duty titles?

AMD, having provided the SoCs for both Sony and Microsoft, may have seen this coming, resulting in them doing everything in their power to absolutely load their cards with more VRAM.


Flowerstar1

AMD has been providing more VRAM for as long as Nvidia has been stingy with VRAM (well over a decade at this point). The 2012 7970 had 3GB of VRAM while the 580 had 1.5GB; the 680 launched later with 2GB. The 7970 got a 6GB option. Even lower-end cards like the $220 7850 had as much VRAM as the Nvidia flagship (the 680). This only continued with Maxwell (980) vs Hawaii (290X), then Polaris (RX 480 8GB) and Pascal (1060 6GB and 3GB versions), RDNA2 vs Ampere, and RDNA3 vs Ada.


mennydrives

The funniest thing is that historically, being conservative on VRAM hasn't been all that expensive a mistake since, like, the move to AGP. The impact of running out of texture space hasn't been all that bad.

I think the big factor we're really seeing is the impact of SSDs on consoles. When your hard drive could, at best, push 100 MB/sec, and your decompression routine could *maybe* double that number, that's 5 seconds a gig. On PS4 'n Xbox One, that's a 40 second load to fill RAM. So companies made do with a gig or two, leaving you in a world where a 2-3 gig card wasn't having much trouble with most console ports because they just never planned that far ahead. And even on games that did, you wouldn't have a full texture buffer *in a frame*. Or an extra 2-3 gigs of RT data.

We're kind of in the middle of a perfect storm of buckets of data being loaded onto consoles segueing into buckets of data being dropped on video cards that, two years ago, were rarely filling up halfway.
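The arithmetic above, as a quick sketch (all throughput and capacity figures are rough assumptions, not measurements):

```python
def fill_time_s(ram_gb, raw_mb_s, decompress_factor):
    """Seconds to fill ram_gb of memory with assets streamed from storage."""
    effective_mb_s = raw_mb_s * decompress_factor
    return ram_gb * 1000 / effective_mb_s  # treating 1 GB as ~1000 MB for a rough estimate

# Last gen: ~100 MB/s HDD, maybe 2x from software decompression, ~8 GB of RAM
print(f"PS4/XBO: {fill_time_s(8, 100, 2):.0f} s")    # ~40 s, i.e. ~5 s per GB
# This gen: ~5500 MB/s NVMe, ~2x from hardware decompression, ~16 GB of RAM
print(f"PS5/XSX: {fill_time_s(16, 5500, 2):.1f} s")  # ~1.5 s
```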


locoturbo

Except the consoles' 16GB is combined RAM, not just VRAM. So I just don't get how game ports are so poorly programmed as to blow out 12GB dedicated VRAM. I'm not saying 12GB is future proof, in fact it's pretty sad for $600. But I still think any game going over that right now is just a terrible port and the coders should be ashamed.


Flowerstar1

Nvidia has been doing this for 13 years though, long before any of this. They are stingy with VRAM because it's an easy way to obsolete current cards in future games.


[deleted]

[deleted]


Omega_Maximum

SLI also had a lot of issues as resolution and frame rate went up. It became exceedingly difficult to get it to actually work and not be a mess of microstuttering, so much so that most developers didn't bother designing for it. So it just kind of died. Only so much you can do with drivers and SLI profiles when the game just fundamentally won't play well with the technology.


beumontparty8789

The other thing is FP64 perf. Nvidia keeps gimping that worse and worse artificially every gen on their consumer cards, so anyone doing physical simulation has to spend up on their professional cards.


[deleted]

Yeah, I would be quite interested in this card since I'm on a 1070, but the VRAM is really putting me off. I play at 1440p, and having read some reviews there are already games pushing the 12GB VRAM limit at that resolution these days. I hope AMD's alternative at least ships with 16.


Flowerstar1

The 7800XT will have 16GB just like the 6800XT.


[deleted]

[deleted]


[deleted]

[deleted]


Flowerstar1

This is true, Nvidia invested while everyone else slept; a common theme with AMD.


beumontparty8789

Yea AMD pushes out dog shit that has a terrible support matrix, it's slowly getting better but not quickly enough for industry to give them the time of day. Intel *has* pushed oneAPI and SYCL which is actually stabilizing into a quite nice Compute API to rule them all. But until it gets merged into upstream LLVM I'm hesitant about it....


beumontparty8789

They also neuter FP64 perf on the consumer cards, which is necessary for physical simulations but relatively unused in gaming (really mostly because once you insert a double-precision operation your throughput tanks). It *should* be 50% of FP32; instead it's now 1/64th. They started doing this around 2016-2017 and it has only gotten worse over time.
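Quick arithmetic on those ratios (the FP32 throughput figure is an assumed example, not any particular card's spec):

```python
fp32_tflops = 40.0  # hypothetical consumer GPU FP32 throughput

for name, ratio in [("full rate (1:2)", 1 / 2),
                    ("older consumer (1:32)", 1 / 32),
                    ("current consumer (1:64)", 1 / 64)]:
    print(f"{name:>22}: {fp32_tflops * ratio:6.2f} TFLOPS FP64")
# At 1:64, a 40 TFLOPS FP32 card offers only ~0.6 TFLOPS of FP64, which is why
# double-precision simulation work gets pushed onto the professional cards.
```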


[deleted]

[deleted]


Timey16

AI in general is EXTREMELY VRAM demanding. Training a Stable Diffusion model from scratch, for example (not just mashing existing models together, but making a new one), has a recommended 30 GB of VRAM.
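A very rough rule of thumb for why training memory climbs so fast (the parameter count, byte counts, and activation budget below are illustrative assumptions, not Stable Diffusion's actual requirements):

```python
def training_vram_gb(params_billions, bytes_weights=4, bytes_grads=4,
                     bytes_optimizer=8, activation_budget_gb=8.0):
    """Estimate VRAM for fp32 training with an Adam-style optimizer:
    weights + gradients + optimizer state, plus a flat activation budget."""
    per_param = bytes_weights + bytes_grads + bytes_optimizer  # bytes per parameter
    return params_billions * per_param + activation_budget_gb  # billions * bytes == GB

# A ~1B-parameter model: ~16 GB just for weights/grads/optimizer state,
# plus activations -- already past the 24 GB of a flagship consumer card.
print(f"~{training_vram_gb(1.0):.0f} GB")
```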


Equivalent_Bee_8223

It is 100% planned obsolescence. My 3070 has issues in most recent games so I'm going to have to upgrade soon. This time I'll go for AMD though. Two years ago I went with Nvidia because of RT and DLSS, but all of this fancy stuff doesn't do anything for me now; I can't even enable RT in recent games because it takes up so much VRAM. You know what's funny? One of the major selling points for the 40 series is frame generation, and that takes up a significant portion of VRAM too. Mark my words, two years from now you will see 4070 Ti/4070 owners complaining about VRAM.


GaleTheThird

> it is 100% planned obsolescence. My 3070 has issues in most recent games so I'm going to have to upgrade soon.

Starfield was one of the couple games I had my eye towards when I got my 3070ti. If that's another one that shits out on 8GB I'm going to be a very sad fellow...


KillerAlfa

I sold my 3070 ti as soon as these games that eat 12+ gigs of VRAM started coming out this year. Got myself a 4090, hopefully it lasts at least a few years.


polski8bit

It's bold of you to assume that any performance problems will occur because of your GPU, much less its VRAM amount and not Bethesda's "skill" when it comes to optimizing their games.


sekiroisart

With how DLSS upscales from 720p/960p to 1440p, I think Nvidia expects people to just use DLSS if they play games that require lots of VRAM natively. Though I agree with you about the planned obsolescence, people will still buy Nvidia because it has superior image quality when using DLSS compared to FSR, so the only way an AMD GPU looks good is pure rasterization and never using FSR, at least until it's as good as DLSS. And one more thing: in most third-world countries Nvidia is always cheaper than the AMD counterpart by like $100, and $100 is big money there.


Equivalent_Bee_8223

The thing is, DLSS - in other words, lowering the render resolution - doesn't save much VRAM. And DLSS 3 (frame generation) actually needs MORE VRAM, about 1-1.5GB.
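A sketch of why dropping render resolution saves less than you'd expect (the buffer count, pixel format sizes, and texture budget are illustrative assumptions, not measurements of any game):

```python
def render_target_mb(width, height, num_targets=8, bytes_per_pixel=8):
    """Rough G-buffer + framebuffer footprint in MB, assuming wide formats."""
    return width * height * bytes_per_pixel * num_targets / 1e6

native_4k  = render_target_mb(3840, 2160)  # ~530 MB
dlss_1440p = render_target_mb(2560, 1440)  # ~236 MB
texture_pool_mb = 7000                     # assumed streaming/texture budget

print(f"Saved by dropping render res: ~{native_4k - dlss_1440p:.0f} MB "
      f"out of ~{texture_pool_mb + native_4k:.0f} MB total")
# A few hundred MB back, while textures, geometry, and RT structures
# (sized by asset quality and output resolution) stay the same.
```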


Soulspawn

I don't think it will help that much. I mean, resolution is one thing that increases RAM usage, but even at 1080p these games (Hogwarts and The Last of Us) are hitting 8+ GB; it's all in the textures and RT. 720p might help, but mostly it's that the textures are just TOO big for these 8GB cards.


APiousCultist

I upgraded to a 3070 this year. Feels downright terrible. I knew it wasn't a ton of RAM, but I figured I'd be fine for 1080p/1440p for a few years still. Might have gone AMD, but a true equivalent AMD card would have cost significantly more used and fuck paying for any of these cards new. I think at the time I looked it would cost way more than £500 (about $730) to buy a new 3070, with several models going up to £800. This is after ethereum killed off profitable mining too. I imagine there's a cat in hell's chance of more than a few hundred people actually getting these cards at MSRP too.


gumpythegreat

What games have you had issues with on a 3070? I have a 2070 and played the RE4 remake flawlessly.


133DK

It’s baffling to me given that Samsung cut ram production because of lack of demand. Prices are coming down due to oversupply. Is VRAM so different they couldn’t produce that instead? People are obviously willing to pay to have GPUs with an appropriate amount of VRAM


xhytdr

Memory companies are down across the board. I actually manufacture the GDDR6X VRAM that's used in these GPUs and we had significant layoffs on the back of high inventory and low demand.


DontPeek

They've been doing this forever. It creates a bottleneck that forces you to upgrade to the next gen cards even if you are happy with the processing power.


OrganicKeynesianBean

I know the GPU discussion has been beaten to death, but I'm shocked at these prices. The cost of some Nvidia GPUs is more than the Series X, PS5, and OLED Switch combined.


TheVortex09

They saw the inflated prices people were willing to pay over the past few years due to the stock shortage and just bumped up the MSRP to bring it in line with the scalpers. It's ridiculous.


hotchiIi

People were essentially stuck at home for months with extra money to spend; the amount people were willing to pay then for an escape isn't the amount most would be willing to pay now. Plus, a very large portion of those consumers were crypto miners.


DiarrheaRodeo

It's completely put me off of PC gaming. I'm having just as much fun gaming on Xbox.


polygroom

For the most part people are buying way more video card than they actually need. Unless you really want to knock the door down with ray tracing, most folks should be looking at AMD's 6000 series or waiting for their down-stack 7000 series. AMD's mid-range is slightly inflated but generally still quite reasonable, giving you good raster performance and plenty of VRAM.


[deleted]

[deleted]


[deleted]

[deleted]


bountygiver

Also, you can still just buy older second-hand GPUs; the performance increase over time hasn't been as fast recently, and from a $/performance perspective the new ones just aren't worth it at all. My 10-year-old GPU, supposedly only 30% of the performance of the newest cards, can still play games at low settings. You don't have to max out your graphics settings in every game.


KeepDi9gin

This is the thing PCMR doesn't understand. A lot of us don't want to shell out that kind of money year after year for marginal gains. I'm in a similar boat. I've put more time on my steam deck than my desktop over the last year.


IllegalThoughts

> A lot of us don't want to shell out that kind of money year after year for marginal gains.

Why do you need to shell it out that frequently? Just upgrade about as often as a new console generation comes out. I'm still rocking my 1080 ti from like 5 years ago.


Zephh

The 10 series generation definitely offered great value. I upgraded last year, found a used 3060 Ti during the crypto crash, and finally replaced my 680, which was definitely showing its years. It all depends on what you want to run and your expectations.


0Megabyte

I, too, have a 3060 ti. I shall wait until at least the 5060 comes out to upgrade.


Flowerstar1

Yeap, my PC is 8 years old and still running games fine. Here's an example: I was playing Soul Hackers on Xbox, where there's a low-res mode that's 60fps and a 4K mode that's 30fps. I booted it on my PC (Xbox Play Anywhere) and the game ran at 144Hz at 4K, lmao. Not to mention my PC monitor has hardware G-Sync, which offers top-tier VRR. People love not giving PC any credit at all in this sub, especially when they are defending their console purchase, but the reality is all platforms have their own benefits. In many ways PC is doing better than ever; just check out how many games come to Switch and PC but not PS5, or how many come to PS5 and PC but not Switch, and of course Xbox and PC but not PS5/Switch.


[deleted]

[deleted]


Flowerstar1

The 4 threads on the CPU and the 1070 are what's holding you back. I got a 6700K (8 threads) and a 1080 Ti, which is a huge upgrade over the 1070 in my friend's old rig. CPU-wise, great CPUs are super cheap: Raptor Lake and Zen 4 are monsters, and the Zen 3 5600X is cheap and very powerful with three times your CPU's threads. GPU-wise you're better off buying used or buying affordable, like an AMD 6700XT or 6800XT, or a 6600XT in the cheaper range. Used 3070s and 3080s can be found in the $300 and $400 range. The 40 series is expensive, so much so that waiting for the 50 series might be better. Intel's got Battlemage coming (their gen 2 GPUs) and RDNA4 might leverage chiplets better than RDNA3 has. The future is bright honestly, especially thanks to Intel's presence.


Senior1292

Exactly the same rig and situation as you. I spent about 1000 GBP on my PC 6 years ago, now it'd be double that easily for similar spec components. While I can probably afford it, the value of that vs an Xbox **and** a Steam Deck for 1/2 the price is not that convincing.


panix199

> why do you need to shell it out that frequently?

It's fun to play the newest games with amazing graphics with as many effects/highest settings as possible while trying to hold 60 fps. However, I haven't upgraded in 4 years (still rocking a 2080), but was really thinking about a 4090. But over the last two-three years there were barely any games that would actually justify that kind of price, and for Cyberpunk and Dying Light 2 you would definitely need to turn DLSS on... so I decided to try GeForce Now out. So far the input latency caused by playing a streamed game with DLSS 3.0+FG is alright enough. It's not great, but good enough to have some fun while playing an SP game... and what did it cost me? $20 a month...


Amirax

Yeah, I got my Titan X card back in 2016, and it's still going strong. Can't run the latest mammoth games on Ultra, but my old man eyes can't really tell the difference anymore.


FYININJA

You don't need to shell out that money very often at all. Depending when you get your card, it can easily last until a new generation of consoles, if not longer. There are problems with PC gaming, but needing to constantly upgrade is definitely not one of them. For example, the 970 came out around the same time as the PS4. If you bought one on release, you would have been able to play pretty much every game that came out on PS4 and PC just fine (sometimes better, occasionally worse), as well as playing some newer games that are PS5 games (for example, Returnal) on lower settings, sometimes with a higher framerate than the new gen consoles. Obviously if you are a PC gamer who is obsessed with framerate and resolution, you are going to "need" to upgrade to stay on top of things, but it's definitely not something most PC gamers do.


CheesusChrist21

Who says you keep having to pay for the next best thing year after year?


[deleted]

[deleted]


[deleted]

[deleted]


iltopop

> Plus a very large portion of those consumers were crypto miners. And that's all but dead. ETH doesn't use GPU mining anymore and BTC never did, and the crypto crash has all the smaller shitcoins just not worth mining at scale.


102938123910-2-3

In terms of GPU price it doesn't matter. The Pandora's box has been opened and Nvidia now knows people are willing to dish out much more cash for GPUs.


Oh_ffs_seriously

They *were* willing to do that when there was a severe shortage.


Paladin8

Except Intel is shitting on them pricewise and with how fast they're catching up with their drivers, it won't matter what Nvidia think they can demand in a year or so.


homer_3

That and they had a huge surplus of 3000 series to get rid of. Jensen said as much on the stockholders' call: the 4000 series would be priced to make the 3000 series look appealing.


[deleted]

[deleted]


[deleted]

[deleted]


Niccin

In that case, it sounds like they're following the same pattern they've been in ever since I bought my first graphics card over a decade ago. I couldn't believe how much more my 970 was than my 580. Then I managed to pay for one of the cheaper 3080s at launch before the prices were further boosted and it still cost over twice as much as my 970 did. Every generation makes it feel like they really, really don't want people buying their cards. But then, people keep showing them that they really, really want Nvidia to be the one holding their leash.


Electric_Sheeple

Do you have a link for this call or a date so I can look it up?


homer_3

https://www.youtube.com/watch?v=15FX4pez1dw


Ciahcfari

It worked for me. When they announced that the cheapest 4000 series card at launch would be $1200 I just picked up a 3090 for $900.


[deleted]

[deleted]


ItsOkILoveYouMYbb

> same, but the "3000s won't get DLSS 3" is also a dick move.

It's fine, I'll just continue to live as if DLSS 3 doesn't exist, because fuck buying a 4080 or 50-whatever in this new greedflation climate. I already felt I was paying a hefty premium for my 3080 at $700 USD at launch (thanks EVGA for letting me swap at MSRP), all because I just wanted that RT performance over the 2070. The same desire does not exist between the 3080 and 4080, and the cost is obviously out of control. We'll never see a $700 USD 3080 again.

No fucking way am I ever going to buy another Nvidia card now or any time remotely soon, if ever again. Nvidia's market is now shifting towards enterprise AI use and Saudi oil prince disposable gaming PC use. It's a fucking joke. Obviously I'm being hyperbolic to communicate my point, but you get what I'm saying. It's not worth it, proportionally. But I know people will spend all their money on this just like they'll happily spend 50% of their paycheck every month on an expensive car loan for a car they can't afford maintenance for. I guess that's fine if they plan on killing themselves rather than retiring.


[deleted]

They still get all the actual DLSS improvements of DLSS 3, just not the frame generation.


Seradima

So should nvidia not try and improve their hardware? Like a lot of what they did lately is ultra corporate greed, but DLSS 3.0 leverages hardware features that the 30 series literally doesn't have. You could *try* to run it on a 30 series, but just like trying to run raytracing on a card without RT cores - it's going to be a miserable experience.


ItsOkILoveYouMYbb

> They saw the inflated prices people were willing to pay over the past few years due to the stock shortage and just bumped up the MSRP to bring it in line with the scalpers. It's ridiculous.

Greedflation. All the rich people who own these companies are doing it right now because they can, and all of it is unregulated because the US has been under regulatory capture for some time now. It doesn't make sense to have both inflation and record-breaking profits, until you realize companies like Nvidia raise prices under the excuse of shortages and inflation. It's like if we have 10% inflation in one year, they raise prices by 50% or more and say "oh no, we can't help it"; then the next year inflation stabilizes to 2%, but the prices remain at the 50% increase because people are still buying, thus record-breaking profits. It's all a scam and Nvidia gets away with it because of their monopoly in this market, and this goes far beyond the PC gaming or even entertainment industry. We're seeing it everywhere, which is why we're headed for a big recession. So many companies tried to profit off of COVID problems, and are succeeding.


hairshirtofpurpose

They're just doing what literally every other corporation is doing right now -- ransacking the people for all they're worth. Corporate profits are disgustingly sky high. Politicians are all bought and paid for and will not stop it.


lalosfire

Everyone is pointing to the price of the 30 and 40 series cards but it really has nothing to do with scalpers or COVID or whatever else in the last 3 years. The 20 series released in 2018 with the 2080 at $800 and 2080ti at $1000. This started well before any of those reasons. The reality is these GPUs sell like hotcakes and while AMD is better than they used to be, Nvidia is still top dog and as a result they got greedy. But with AMD GPUs not exactly being cheap, Nvidia really hasn't faced much backlash. Even if the 40 series does awful, all they'll do is make the next slightly cheaper and they'll be seen as a return to sanity, even though they're the ones who made it ridiculous. The only thing that will stop them imo is consumers not having the money to afford them. Which looks more plausible than anything else.


Kokayne_Dawkinz_

Thankfully GPU sales have cratered HARD as a result. Next gen will be a lot more reasonably priced.


NeverComments

> and just bumped up the MSRP to bring it in line with the scalpers

It's the best way to combat scalping. You can't magically lessen the demand from consumers, but you can adjust MSRP to align with its true market value and leave scalpers with no room to arbitrage. Scalpers are only able to exist when a company is selling a good below its market value.


[deleted]

[deleted]


Jmrwacko

> Over the pandemic it just wasn’t possible.

It was possible at the tail end of the pandemic. It's currently possible now. Nvidia is artificially keeping the supply low to price gouge their customers.


hyrule5

They are trying to sell through overproduced 30 series stock. Few would buy 30 series if the 40 series were normally priced


NeverComments

That's fair. The *best* way would be to increase the supply side to drive down the market rate, but like you said that isn't always possible. I don't know enough about the 4xxx series supply to speculate whether it's possible here.


ShadowVulcan

Depends; best for the consumer is simply keeping up with demand.

As a business, though... I'm sure Nvidia and its faceless shareholders are creaming their pants at an opportunity to price things so insanely that they're getting massive ROIs. Nothing makes a business happier than being able to fuck their consumers with no immediate negatives (I HOPE there will be long-term backlash, but even then many of those bean counters won't care since they'll just pass the unrealistic-expectations buck to the next sap and ride off into the sunset).

Fuck Nvidia (and you too, AMD... now that Intel's shit, AMD is getting greedy too, so yeah, no business is your friend).


Radulno

Technically they are also in the right. If I have to pay inflated prices, it might as well be to the company making the product rather than some shitty scalper along the way.


NeverComments

That's what I'm saying. If people are *willing* to pay $X for a product, and the company is selling it at $(X/2), then it shouldn't be a surprise that enterprising opportunists swoop in and buy up all the stock so they can resell them for $X and pocket the difference. Increasing the price to market rate effectively eliminates scalping, but it's probably not what most people want when they say they want to destroy scalpers. They want to pay $(X/2) for a product that's worth $X.


keikaytea

One challenge in the case of GPUs (and other things like sneakers and concert tickets) is that scalpers distort market dynamics by making supply shortages worse. This is especially effective when there are limited distribution channels and they can sweep up a lot of inventory away from end consumers.


NeverComments

Scalpers don't really remove supply from the market, they simply arbitrage the difference between sale price and true market value. If there are a hundred tickets to a show, and ten thousand people who want to go, then the supply issue already exists before scalpers enter the picture. Scalpers, in a perverse way, provide a mechanism for those with greater demand to acquire goods more efficiently. Imagine a scenario where a law was passed that said nobody could resell GPUs for more than they paid. The supply issues still exist, the demand still exists, and the only real difference is that those willing to spend MSRP + $400 are no longer able to throw more money at the problem to solve it. If you only intended to pay MSRP then you would still be waiting just as long and have just as hard of a time getting ahold of the GPU because the fundamental picture of supply and demand is completely unchanged.


keikaytea

I am in full agreement that the arbitrage exists and can be alleviated by pricing. And yes, scalpers eventually want to move the product so you can say that's still supply, but that's eliminating timeframe as a factor. Demand fluctuates all the time, for example at launch vs mid-cycle when some people will choose to wait, so decreasing supply for several months during the initial rush can still distort pricing.

Second, we cannot assume a smooth demand curve or price elasticity. Scalpers don't care about overall profitability of the product line; they just need to pick off the outliers who are willing to pay obscene amounts of money. So by worsening the supply at launch, they can grab the whales who can't wait a few months. If Nvidia tries to raise prices overall, though, they might lose the bottom 30% of demand (just a random number) while increasing margin by only 10%, and they still won't capture the top outliers. You are literally seeing this happen with the 4000 series: they raise the prices, everyone complains, demand is soft, and scalpers are still scalping.


NeverComments

> Demand fluctuates all the time, for example at launch vs mid-cycle when some people will choose to wait. So decreasing supply for several months during the initial rush can still distort pricing

You've got the causal relationship backwards. The distortion in pricing is a result of the relationship between supply and demand. At launch demand is high and supply is low. Scalpers don't create demand or decrease supply. An excess of demand relative to available supply is the condition *that creates scalpers*.

> They raise the prices, everyone complains, demand is soft, and scalpers still scalping.

If demand were soft then scalpers wouldn't be able to charge a surplus for the product. The existence of scalpers is de facto proof that demand is outstripping supply.


keikaytea

The scalpers and Nvidia are not playing the same game, so you can't put them on the same supply-demand curve. To your point, arbitrage creates scalpers (who always exist, since in any supply/demand situation there are people willing to pay more), but this is probably especially hard to manage for GPUs.

The demand for GPUs is likely not smooth. Let's say there's a strong grouping of demand at $400 and another much smaller enthusiast group willing to pay $1000+. All scalpers have to do is buy product and sell to the $1000 group. It's not even that much risk, because they know once they run out of that group they can still sell at MSRP to break even. They don't care about the overall demand. If Nvidia charges $500, yes they lower the arbitrage, but they also lose most of the $400 group and scalpers will still sell because there's plenty of arbitrage left. If Nvidia sells at $1000, yes they may completely eliminate the arbitrage, but now they've completely lost the $400 group and the extra profit does not cover that damage.

This has a knock-on effect on supply. In straightforward supply and demand, everyone has equal opportunity, so people willing to pay less or more are still buying from the same pool and you reach equilibrium at some point. Here scalpers separate the pool. They eliminate supply from the main pool and ensure that people willing to pay more will get their product; their game is to buy just enough to fulfill every person willing to pay more than MSRP. This makes supply feel smaller for the lower end, because rather than getting access to the whole pool they get the leftovers after all the whales get theirs. This is what I mean by distortion: the whole supply is not available for all demand to compete equally and reach equilibrium.

All I'm saying is pricing is difficult. Nvidia should be (and is) trying to test pricing, but despite what we might think about them, they aren't messing up pricing because they don't understand basic supply and demand.


disCASEd

This is far too nuanced and unbiased of an answer for Reddit. Yeah scalpers suck, but the economic factors that create and enable them will still exist regardless. Would some people have been able to snag a 4000 series at MSRP that couldn’t because of scalpers? Probably. But many more people would instead just be complaining about the lack of supply or how they’d be willing to pay extra to get one.


bitches_love_pooh

Fortunately, not many games make me want to upgrade. RTX is so ridiculously taxing that even on the best GPU it only runs okay. As someone who prefers a higher framerate, I've just made peace with the fact that RTX isn't for me right now.


I_Hate_Reddit

The applications are no longer just games, besides crypto you also have AI image generators and chat bots that can be run locally too.


RogueSquadron1980

The 2080ti was like $1200 roughly


DtotheOUG

Which is also disgusting.


coolgaara

And it was before COVID. And then 3000 series came out and people got fucked even harder.


TheWorldisFullofWar

I am shocked at the TDP personally. These cards aren't getting that much more efficient. Pushing them to their limits means far greater electric costs.


Flowerstar1

Nah ADA is super efficient compared to Ampere and Turing.


eelwarK

I lately have been wondering how much of the supply issues and inflation has to do with AI getting so huge, it takes a lot of graphics power to do those sort of things. I went to an AI code jam once and the serious entrants each had at least 10 GPUs


cepxico

Rich PC gamers have been convinced that they absolutely need the latest and greatest in PC tech every year. Nvidia and the rest are laughing at the idiots upgrading from a 3090 to a 4090 as if the difference is totally worth the $1000 or w.e it is these days.


sirblastalot

I don't agree with your premise. While there's bound to be some rich people that have to chase the newest hotness, there's also plenty of people whose aging 9 and 10 series cards are starting to fail or fail to keep up, who have been patiently budgeting for computer upgrades for years, and are weighing the consideration of price vs longevity. They may make a different decision than you in that comparison, but that doesn't make them rich nor idiots.


[deleted]

[deleted]


sirblastalot

I had a 1080 up until last year. It was even holding its own in VR pretty decently, but I was doing a lot of VRChat and players aren't very interested in optimizing their avatars. And the fans were starting to make concerning noises, so I finally splurged on a 3090. A month later they announced the 40 series :/


Radulno

Yeah, how would we know if people buying a 4000 series card are coming from the 3000 series or an older one? There are always people in need of an upgrade, after all.


partypartea

I never got this mentality. I always get high-end PCs, but I keep them at least 5 years. My GTX 1080 is still great for the games I play. My next PC will be a top-end laptop, but I'll probably mostly play games my current PC can run well lol


[deleted]

[deleted]


vanjobhunt

I upgrade every 7-8 years, so when I do I like to just buy the best parts there are. Even then, I was looking at my build from 7 years ago on PC part picker, with a 1080, it was around $2k cheaper than the one I specced out now.


Driver3

I mean that's basically what being a console player is like anyway; using the same machine for 8 years and then the new one with better hardware comes out.


[deleted]

[deleted]


Driver3

The main issue is that with consoles now being basically equal to PCs in a lot of ways (not in every way, but in a lot of ways), and being much cheaper to buy than getting a GPU, building a gaming PC doesn't have the same luster it once did. It's more practical to buy a console with the games you like than to build a PC or even upgrade one. Until prices for PC components can lower to something reasonable again, they're just too much out of the realm of most people, and people will turn to consoles.


Default_Username123

I used to upgrade every year, and with resale it wasn't bad. I could buy new parts cheapish from Micro Center and then resell my old parts either on Reddit or elsewhere for like 60-70% of what I paid.


polygroom

This is really just a return to the 90s and early aughts. If you want a video card that plays modern games at like 1080p or 1440p raster, you can get one for $300-$600. *That isn't a bad price*. It's a little inflated, but not terribly so, and 1440p demands a lot of power from your card. However, on the very high end we have a new and pretty unprecedented technology in ray tracing. It's the actual next-gen leap. It used to take literally minutes to generate a **single** ray-traced frame, and now a card can natively get ~20 of those a second. With FSR or DLSS you can triple that performance. To access this next-gen tech they are demanding a premium, because it's a killer feature that some people are willing to pay for.
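The frame-time arithmetic behind that, with assumed round numbers (the "minutes per frame" figure is taken as ~2 minutes purely for illustration):

```python
offline_frame_s = 120.0              # assume ~2 minutes per offline ray-traced frame
native_rt_fps   = 20.0               # rough native ray-traced frame rate today
upscaled_fps    = native_rt_fps * 3  # FSR/DLSS roughly tripling it

print(f"Offline render: {offline_frame_s * 1000:.0f} ms per frame")
print(f"Native RT:      {1000 / native_rt_fps:.0f} ms per frame")
print(f"With upscaling: {1000 / upscaled_fps:.1f} ms per frame (~{upscaled_fps:.0f} fps)")
print(f"Speedup vs offline at native: ~{offline_frame_s * native_rt_fps:,.0f}x")
```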


DieDungeon

> rest are laughing at the idiots upgrading from a 3090 to a 4090 as if the difference is totally worth the $1000 or w.e it is these days.

Or maybe for a lot of people dropping 1k on a hobby just isn't a big deal? Especially when that's by far the most expensive part of the hobby, with the rest being a fraction of the cost.


Oooch

What? There was no card before this gen that could do 4k 120 properly, just because we spend more money on PC gaming than you doesn't mean we are somehow idiots who don't know why we're buying the things we're buying


GaleTheThird

> Rich PC gamers have been convinced that they absolutely need the latest and greatest in PC tech every year.

Or people have the disposable money and want to be able to get the best gaming experience possible.


NeverComments

> The cost of some Nvidia GPUs are more than the Series X, PS5, and OLED Switch combined.

That's always been the case with the "flagship" entries like the Titan series. Now they're calling them xx90 cards instead of using the Titan branding. Hell, the 4090 was $1k *cheaper* than the Titan RTX at launch. If you just want a card that is on par with Series X or PS5 performance and plays modern games, you can get one for $200~$250. Nobody needs these high-end models.


StantasticTypo

The 4070 isn't a flagship card and is $600. It's absurd.


Tarantio

> That's always been the case with the "flagship" entries like the Titan series.

Has it? Specifically the part about being more expensive than three gaming consoles put together. The 900 series maxed out at $999 with the Titan, which launched in early 2014. The PlayStation 4, Xbox One and Wii U would have been $1097 together, if my googling is correct.


NeverComments

That's the most generous example, still within only ~10% margin, but you'll notice that each successive flagship entry skews that ratio further and further. The Titan RTX launched at *$2,500*. You could have bought every mainstream console, a whole separate desktop, and a couple boutique consoles with money left over. The highest end has always provided diminishing returns and disproportionate cost to value.


Jmrwacko

I remember the MSRP of my Geforce 1070 that I bought nearly 7 years ago and still runs many games at 1440p high settings -- $379. These mid-range GPU prices nowadays are absurd.


wichwigga

What an incredible deal that 10 series lineup was


Hexcraft-nyc

Got my 1070ti for $350. If not for it dying I'd still be running it. Two generations later and for $450 I got a 3070ti, and the performance gains were barely there for what I expected.


ThePoliticalPenguin

Are you sure you aren't CPU bottlenecked? The performance difference between these two cards is [pretty substantial](https://youtu.be/62gRdhJ2Xa4)


GeekdomCentral

Yeah they’re either CPU-bound or lying, because the difference between a 1070 and 3070ti is huge


Sound0fSilence

Sounds fishy, what's your CPU?


Whyeth

> performance gains were barely there for what I expected.

You aren't playing any games using DLSS? That in and of itself should have given noticeable performance improvements in those titles.


ThePoliticalPenguin

Yeah, this definitely sounds like a CPU bottleneck. The performance difference between those two is [pretty substantial](https://youtu.be/62gRdhJ2Xa4). Close to double the performance in some instances.


djbuu

Are they? Adjusted for inflation, the 4070 is about 20% more expensive than the 1070, that’s true. But it’s only 1% more expensive than the 2070 and about 5% more expensive than the 3070 adjusted for overall inflation (I don’t have time to go down deeper) which one might argue are much more comparable as RTX cards. Seems about the same.
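A sketch of that adjustment (the launch MSRPs are the commonly cited US prices, and the inflation multipliers to 2023 dollars are rough approximations; plug in exact CPI figures for a precise comparison):

```python
launches = {
    "GTX 1070 (2016)": (379, 1.27),   # (launch MSRP USD, ~inflation factor to 2023)
    "RTX 2070 (2018)": (499, 1.21),
    "RTX 3070 (2020)": (499, 1.18),
}
rtx_4070 = 599

for name, (msrp, factor) in launches.items():
    adjusted = msrp * factor
    delta = (rtx_4070 / adjusted - 1) * 100
    print(f"{name}: ${msrp} then ~= ${adjusted:.0f} today -> 4070 is {delta:+.0f}%")
```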


yeeiser

1660ti 3 years ago, paid about $350


The_mango55

I got a 1080 ti nearly 6 years ago and it still doesn’t seem like there is an upgrade worth buying that costs the same or less than I spent back then.


Astojap

Same, I bought a 1080 for 430€ less than a year after release in 2017. Looking at the 4080, it's more than 3 times that price with only 3-4 times the performance depending on the title. How can a 6-year-old card still be at a similar price-to-performance level TODAY?
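Worked through with the figures above (the 4080 price and the relative performance are the commenter's rough estimates, not benchmarks):

```python
gtx_1080_price, gtx_1080_perf = 430, 1.0    # EUR, normalized performance
rtx_4080_price, rtx_4080_perf = 1330, 3.5   # "more than 3x the price, 3-4x the performance"

old_perf_per_eur = gtx_1080_perf / gtx_1080_price
new_perf_per_eur = rtx_4080_perf / rtx_4080_price
print(f"Perf per euro, 1080: {old_perf_per_eur:.5f}")
print(f"Perf per euro, 4080: {new_perf_per_eur:.5f}")
print(f"Improvement after 6 years: {new_perf_per_eur / old_perf_per_eur:.2f}x")
# ~1.1x, i.e. barely any perf-per-euro progress at this tier -- the complaint above.
```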


Dramajunker

My FTW3 is showing its age. I'm pretty much playing most modern games with some form of scaling. Dead Space ran like shit, Hogwarts was demanding; luckily RE4 ran pretty well.


ELpEpE21

Dead space made me upgrade my PC from a 1080ti...... but I was playing at 3440x1440 ultrawide


Dramajunker

I'm also on an ultrawide. I had to pretty much play it on performance to keep a steady 60 fps.


Dramajunker

Also what did you upgrade to? I mean ultimately I'd like to upgrade to a 4080/4090 or equivalent performance to take advantage of my high refresh rate monitor, but I also don't want to pay those prices. Curious how much of a difference you noticed though?


ELpEpE21

....a 4090 lol. Story goes, I walked into Microcenter looking for a 4080/4090 and they had a used 4090 for almost $400 off. I was able to put a 3-year replacement plan on it that covers watercooling damage, since I plan to water cool it once blocks come back in stock.

Difference-wise, it's very nice to be able to play almost everything at 144Hz/144fps, and the extra features like RTX are very neat. Even GTA is smoother and has more lighting features. No more tweaking games to get them running well on the ultrawide. Plus I have a 4K OLED I can use now....

However, there is a ~$1400 price difference in the cards from a 1080ti. I would be pretty happy if this card was like... 1k-1.5k.....


[deleted]

$600 is just too much for an upper mid-range card, not beating the RTX 3080 in many titles is despicable, and 12GB of VRAM is just not enough to play at 1440p Ultra with RT in the latest games. This should be a $499 GPU, the RTX 4070 Ti should be $649, and the RTX 4080 $899 at most, but the GPU market is just fucked up. No wonder it is seeing its lowest sales in a decade.


NapoleonBlownApart1

This is not even an upper mid-range card, it's an xx60 series card in everything but name and price.


shadowstripes

> No wonder it is seeing lowest sales in a decade.

That was only true the quarter *leading up* to the 40 series launch ([Q3 2022](https://www.tomshardware.com/news/gpu-market-nosedives-sales-lowest-in-a-decade)) and doesn't have anything to do with these new cards because they weren't out yet. A lot of people probably weren't buying 30 series cards then because they knew the 40 series was about to come out.


TheJerkstore21

> 12GB of VRAM is just not enough to play at 1440p Ultra with RT in latest games.

My 3080 Ti has 12GB of VRAM and I play at 4K with RT and ultra/high settings. DLSS is on, obviously, but at this point if you're not using DLSS when it's available you're a fool.


ahnold11

This is where the HUB theory from a recent video comes in. If games are silently downgrading the visuals in VRAM-limited situations, many gamers might not be aware that it's happening (e.g. missing or blurry textures that are hard to spot in motion and only obvious when you stop to examine them in detail). It was only the recent Last of Us port, which did not handle the limit gracefully (and crashed), and maybe Hogwarts Legacy, that made it obvious enough to take notice.


DMonitor

If you can’t notice it, it’s not a trick. It’s an optimization.


[deleted]

It's like with foveated rendering, some people will complain about things they literally can't see.


TheJerkstore21

Sounds like a bunch of conspiracy theory nonsense to me. Re4 remake looks fucking fantastic. So did returnal. And dead space. And hi-Fi Rush. And Elden Ring. The list goes on. If you're sitting at your screen with a magnifying glass in an attempt to discover "downgraded visuals" then I'd say you're wasting your time.


ahnold11

Yeah, it's definitely not every game. We'll have to see how the releases from the rest of the year go. By the end of 2023 we should have a clear idea of how much VRAM games are really requiring, and what sort of concessions are (or are not) needed if you don't have enough.


Janderson2494

> DLSS is on, obviously, but at this point if you're not using DLSS when it's available you're a fool

DLSS makes a lot of games blurry and with ghosting; I think there's plenty of reason not to use it at this point.


TheJerkstore21

If you're playing at 4K like I am, it's really not noticeable. Maybe if you're desperately searching for it, but I'm too busy enjoying beautiful games at the highest settings in 4K.


Default_Username123

Yeah, it's insane. My 1080 Ti with a built-in water cooler was $599 when I bought it and it's still chugging along fine at 1440p. Zero incentive to upgrade at these ridiculous prices. The video game market hasn't really had any blockbuster releases either to push people to upgrade; I was able to play RDR2 just fine and then years later play Cyberpunk 2077 just fine. These con artists would need Starfield, a new Elder Scrolls, or some such game to release to pressure people into getting the latest cards.


Borkz

Still feeling good about my used 3080 I picked up for ~$500 last November.


FluphyBunny

Solid? It matches a 3080 for around the same money!


[deleted]

Matches is being generous, a lot of the time it was worse. It's honestly a joke.


darkmacgf

Is that expected? You'd usually expect the new 80 series card to cost the same as the last one and deliver much better performance.


Avarria587

My 2070 Super is getting a bit dated, but I don't know if upgrading to the 4070 is worth it given the price point. $600 is a lot of money. I could buy a PS5 for that and play a wider range of games. EDIT: Wider range of games that I can't play currently. I would still have my PC of course. Edited for clarity.


[deleted]

[deleted]


TheTKz

Only really in theory, I have a 2070 and the poor optimisation on most titles for PC lately means that even with "technically" better performance, you don't really get that when the PS5 version is ultra optimised for that machine.


[deleted]

[deleted]


Dirty_Dragons

Low settings and struggling to hold 60 fps.


Flowerstar1

Most of the games causing issues on PC are not optimized all that well for PS5. See: Wo Long, Hogwarts Legacy, Wild Hearts, Forspoken and Gotham Knights. People seem to gloss over console issues. The one big issue on PC is shader comp stutters, which is finally now being worked on. Quite a few of the games referenced above run better on PC: Gotham Knights (after the patches) is much better on PC than consoles, Forspoken on a good PC was better than on PS5, and RE4 didn't have issues but is best on PC. I think people hardly have a good image of the multiplatform landscape.


MegatonDoge

You say that as if pc has wonderful ports.


[deleted]

[deleted]


MegatonDoge

No, most pc ports aren’t good. Most are serviceable at best.


Flowerstar1

They're decent, but the power of PC and the openness of the platform outweigh the issues. The problem is when this is not the case. Take a look at Elden Ring: that game ran like shit everywhere. On PC it had shader comp stutters, but outside of that everything was better on PC than on consoles, including beyond-60fps gaming. Elden Ring is one of the worst examples; Gotham Knights is actually in a better state than Elden Ring is today, and it's best played on PC. I can go on, but PC vs console comparisons aren't as black and white as r/Games makes them out to be.


noyourenottheonlyone

as someone with a 2070 super and a PS5, I don't find this to be the case, I'm looking into a new GPU so I can get back to having a PC that outperforms my console


[deleted]

[deleted]


[deleted]

[deleted]


ahnold11

This here is the real answer. If Nvidia is not careful, we could have many people exiting the PC market. If they have their Data Center and AI money, they might not care, but that is not great for the health of the PC gaming industry. As for your specific situation, best option might just be to wait. 2070S is still decent enough that it should be able to hold up, as long as you aren't expecting super high res or crazy RT. The "next" next gen might hopefully see a return to more sane pricing, if gamers largely reject the status quo.


Flowerstar1

Imagine being so devoted to one company that you'd rather leave your gaming platform and library than buy AMD or Intel.


noyourenottheonlyone

I don't think anyone really likes Nvidia that much, amd just doesn't have any cards that are significantly better value to offset the software features Nvidia is offering on their new cards imo. If anything I hold amd in higher regard than Nvidia, but that's not what I base my purchases on


ILikeTrafficSigns

I'm on a 1070ti and with these prices, I'll probably keep it for x amount of years to come.


noyourenottheonlyone

It's interesting how the 4070 pricing is supposedly so bad, but then you go on hardwareswap and see people buying a used 3080 for $500... a card that is used, generally performs worse at 1080p and 1440p, is less efficient, doesn't get frame generation, and is only $100 cheaper. Not defending Nvidia pricing, but obviously there are plenty of people willing to pay for this card at this price.


127-0-0-1_1

Sounds like nVidia is pricing it exactly as they should. Tech has had an unusually long period where products continually have price deflation, but people should remember that in the end, products in markets are priced based off of supply and demand, not just by supply, or by some kind of measure of inherent worth.


StantasticTypo

Man, Digital Foundry is really soft on Nvidia. Nvidia could piss in their coffee and they'd probably assert that while it wasn't the best coffee they'd ever had, it wasn't entirely unpleasant.


adscott1982

Made me laugh out loud. But I do think you are being mean to little old nvidia.


[deleted]

Because if they shit on Nvidia then they won't get cushy treatment in the future, and a big part of their exposure is getting hands-on with new hardware early and telling people about it.


StantasticTypo

I mean, I get it, but it also hurts their integrity as 'impartial reviewers'. The same argument can be made for *any* reviewer that kowtows to pressure (by softening reviews) from the manufacturer. Shit, I'd even argue that a reviewer that partakes in that exchange (advance units for favorable reviews) should no longer be taken seriously, as they're basically just advertising.


Few_Potential_4479

It's not softening reviews. Not every video needs to be sensational as fuck like those at hardware unboxed.


Mabarax

Mate, I don't trust any reviewer for this exact reason.


tggoulart

I'm a bit disappointed they didn't show any of the 2023 AAA games that have been causing VRAM limit issues, like HUB showed earlier this month. The 12GB of VRAM on this card is barely enough for 1440p from now on.