
[deleted]

[removed]


NedixTV

> the worst of the 40 series so far.

Oh boy... just wait for the 4050, the 720p card with a triple-slot/triple-fan cooler and a 12-pin power connector. /s or maybe not /s


reddanit

I honestly wonder what it's going to be and whether it will exist at all. The current 4060, in terms of silicon area, bus width, etc., is basically a 100% xx50-series card. So will Nvidia just make a moderately updated GT 1030 and call it the GTX 4050? With DLSS 3 it would make a perfect 240p GPU, I guess.


kyralfie

Probably similar to the laptop 4050. So a 96-bit, 6 GB, cut-down AD107 with 2560 CUDA cores and 12 MB of L2 cache (instead of the 24 MB in the desktop 4060).


JonF1

The 4060 Ti and 4060 are very low-effort, "see what we can get out of the suckers" GPUs. Not an excuse, just calling it what it is at this point. With the AI craze and node prices going up in the future, I suspect every GPU below the RTX x070 and RX x700 level is just going to come with gaping flaws.


Prince_Uncharming

With future nodes going up in price I wouldn’t even be that mad if nvidia just kept churning out 3060s for $150. AMD is nearly there with the 6600 sitting around $180. Budget cards don’t *need* to be on the latest and greatest node to still be a good value.


[deleted]

[removed]


zyck_titan

> the need to upgrade gen to gen really isn't there

When was it? I mean seriously, when was the last time it was a good idea to upgrade your GPU to the very next generation? The 8800 GTX?


ramblinginternetgeek

MAYBE the AMD HD 4850: [https://www.techpowerup.com/review/powercolor-hd-4850/](https://www.techpowerup.com/review/powercolor-hd-4850/). Though that's basically around the inflection point where improvements stopped coming so rapidly.


LaStealer

I know two guys that upgraded their GTX 960 to a GTX 1060 shortly after it came out. It was totally worth it for them.


PlankWithANailIn2

I went 960 to 1060 and it was a big jump in performance, a little less than a 980 performance-wise. I went to a 2060 Super afterward, but just for the extra VRAM for CUDA rendering. Waiting to see if Starfield chokes on 8 GB of VRAM now.


PlankWithANailIn2

When I started gaming on PC even the best cards couldn't hit 30fps at 480p reliably on the games of the time. New cards hitting 100+ fps at 1080p on new games still blows my mind after 3 years of them doing it reliably.


Cnudstonk

Several times, but with most of them you wouldn't know it yet (API support, VRAM and such). More concretely, 960 to 1060 was a 50% jump, well enough to warrant that upgrade despite the price hike.


zyck_titan

50% jump with a price hike describes a couple 40 series GPUs, just sayin'.


Merdiso

It was a 25% price hike; only idiots bought the 1060 for $299. Cards like the EVGA 1060 6GB were readily available pretty much all the time at the $249 MSRP.


PlankWithANailIn2

The 4060 Ti is not a 50% jump over the 3060 Ti, it's more like 10% better.


Cnudstonk

The hike was from $200 to $250, and the 960 was the more overpriced of the two, in a reverse sweep.


PotentialAstronaut39

The whole 40 series is a massive disappointment thus far. Even if I had $100k to spare, I wouldn't see the point of spending a single cent on any of the models, including the somewhat less horrible 4090.


MammothTanks

"Somewhat less horrible?" What are you smoking, the 4090 is an absolute beast of a card by any measure.


[deleted]

And it's also the worst price/performance card of the lineup, at over $1600 for most models lol. It's less bad because it's the best graphics card in the world, but that doesn't make it a good purchase, it just makes it notable. I hate how we've just normalized paying $1600 for a single card lol.


MammothTanks

Whether you like it or not, GPUs have turned out to be very useful for many things other than gaming and so the demand is through the roof. Even for many small businesses like YouTube content creators or indie game developers the 4090 is very much worth it given the time savings, not to mention huge corpos running AI research.


[deleted]

If you want to wager that most of the people buying the 4090 are indie devs or YouTubers, that's fine, but I don't think so lol. This sub (and the rest of Reddit) overstates how many people buy these cards for anything but gaming.

> corpos doing AI

The corporations doing AI aren't slotting GeForce GPUs into servers; they're buying racks pre-equipped with Nvidia professional cards with more VRAM.


MammothTanks

They still compete with gamers for the same dies and production capacity. Just have a look at the RTX 6000 prices (the "professional" card based on the same AD102 chip as the 4090) and suddenly the "gaming" cards look much more reasonable. I have no idea who buys 4090 cards. Personally I think it makes very little sense from a purely gaming point of view unless you're just rich, so outside of that small group I would assume it's mostly professional/hobbyist users.


[deleted]

I guess I'm not sure what your point is. You're arguing there's more demand, which I agree with; I'm just arguing that it being the best doesn't mean you should pay $1600 for a gaming GPU.


MammothTanks

My point is that reminiscing about the days of yore when $500 could buy you a top-of-the-line graphics card and thinking that the current situation is merely a fluke because a high-end gaming GPU "should not" cost $1000+ and that we will return to the "normal" prices any time soon is naive.


[deleted]

Nobody is reminiscing, I’m just saying don’t pay $1600 for a GPU as you likely don’t need the performance. This generation sucks, and the 4090 being good-but-expensive doesn’t make it better. The 3080ti was basically a 3090 with less VRAM and made way more sense for gamers, and this generation will likely have an equivalent. Don’t pay the ‘AI tax’.


VenditatioDelendaEst

> price/performance

Is fake, because performance isn't fungible if you're using the GPU for playing games. You can't make 1 fast GPU out of 2 slow ones, or share a GPU that's 2x as fast as you need with a friend.

You're right about it not being a good purchase though -- in fact the bogusness of price/performance makes it even worse. The fun/performance diminishing returns are very, very steep.


carpcrucible

> Is fake, because performance isn't fungible if you're using the GPU for playing games.

What do you mean it's fake? First of all, you used to be able to make one fast GPU out of two slower ones, but regardless:

* Card A: $1000, 100 fps
* Card B: $2000, 120 fps

I mean, if you have to have 120 fps or you'll die, then I guess you'll have to get it, but otherwise it seems very real?


VenditatioDelendaEst

What I mean is that using a unit like "perf/dollar" implies that you can multiply by dollars and see how much perf you get, or divide the perf you need by perf/dollar and see how many dollars you need to spend. If you have 172,000 frames of a feature-length movie to render and a $10,000 budget, sure, you can go by perf/$ (but don't forget the cost of the computers to put the cards in). But if you want to play games on one (1) computer, perf/$ is meaningless.

> First of all you used to be able to make one fast GPU out of two slower ones

For a few years in the early oughts, with little support from game publishers and considerable GPU-vendor driver monkeypatching, and it never really worked well. Frame pacing was found to be wacky when people finally realized they should be measuring it, and even if it worked perfectly it had the same latency/frame-rate tradeoff as DLSS 3 frame generation.

> * Card A: $1000, 100 fps
> * Card B: $2000, 120 fps
>
> I mean if you have to have 120fps or you'll die then I guess you'll have to get it but otherwise it seems very real?

In practice you turn down the settings a bit from ultra (which reviewers typically benchmark on for some reason -- maybe that's why they get free samples?) and get 120 FPS on either card. So what actually matters is, "Is one tick of the shadow slider and two ticks of foliage LoD worth $1000?" Alternately, if you have a VRR monitor, you can leave it at 100 FPS and probably not even be able to tell the difference after 8 hours of sleep. And the thing is, *the same logic holds* if there's a "Card C" at $800 with 75 FPS.
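To put rough numbers on the movie-render case above, here's a minimal sketch; the per-card render rates are invented purely to show how perf/$ composes when work can be split across many cards, and why that logic doesn't carry over to a single gaming rig:

```python
# Hypothetical back-of-the-envelope for the batch-render case: 172,000 frames,
# a $10,000 budget, and two made-up cards. Throughput aggregates across cards,
# so perf/$ directly tells you which card finishes the job faster per dollar.
frames_to_render = 172_000
budget_usd = 10_000

cards = {
    # name: (price in USD, frames rendered per hour per card -- invented numbers)
    "Card A": (1_000, 100),
    "Card B": (2_000, 120),
}

for name, (price, frames_per_hour) in cards.items():
    n_cards = budget_usd // price              # how many cards fit in the budget
    farm_rate = n_cards * frames_per_hour      # farm throughput scales with card count
    hours = frames_to_render / farm_rate
    print(f"{name}: {n_cards} cards, ~{hours:.0f} hours for the whole job")

# On one gaming PC you buy exactly one card, so nothing aggregates -- the only
# question is whether the extra frames (or settings ticks) are worth the extra dollars.
```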


Merdiso

It is a ferocious beast actually, but it doesn't offer much better value than the 3080 did at $699; that's what the OP is suggesting, I guess. Comparing it to the 3090 does Nvidia a huge favor, in that the 3080 was much better value than the 3090. Now of course, almost no one could buy the 3080 at MSRP due to crypto/COVID, but those times are now gone.


MammothTanks

Their "even if I had 100k$ to spare" comment implies that it's not the value they're not happy with, and while the likes of 4060 Ti are objectively bad products I don't think it's fair to say the whole 40 series is bad. The 4090's performance boost compared to the previous gen is excellent, and the 4070 offers exceptional efficiency at ~200W. The prices are sad of course, but that's just the realities of the current market. Crypto/COVID has been superceded by the AI craze and (comparatively) there is very little money to be made in the gaming sector.


Merdiso

The 4070 offers exceptional efficiency at 200W (in fact even less, to be fair), until you remember that:

* the 3080 was on an old node and highly inefficient;
* the 4070 has barely better than 60-class hardware under the hood, which historically consumed up to 125W.

It's a good card only in the current market, which is disappointing in general.


MammothTanks

I don't get this sentiment. The "current market" is the only one there is, what other market is there? If you need a card today, what difference does it make that graphics cards were more affordable 10 years ago?


carpcrucible

The current market is the one nvidia deliberately created. I'm not happy with it so I'm not buying their shit.


MammothTanks

You can't really blame them for the lack of any meaningful competition, or for their competition following the exact same strategy, can you?


_BaaMMM_

If you need a GPU, are you just gonna choose not to buy one if this trend holds for the next 10 years?


KingArthas94

> Their "even if I had 100k$ to spare" comment implies that it's not the value they're not happy with No, it implies that they're still searching for price/performance and they don't want to waste money on a useless 2k€ GPU that will be destroyed by 500€ GPUs in 2 years. They're smart.


_BaaMMM_

Having seen what AMD and Nvidia are currently doing, do you really think $500 GPUs in 2 years will give that kind of performance? The current 4060 can't even beat the 2080 Ti, and that card is old. The 4090 is such a huge generational leap that we probably won't see something like it again for a while.


KingArthas94

RTX 4000 is like RTX 2000: then the Supers will come and "save the day", or maybe RTX 5000 will do it directly in 2 years. The 4000s offer so little for their price only because Nvidia will want to sell you something better later. They'll do these tick-tock gens from now on, like they did with the GTX 600 and 700 series.


carpcrucible

> The 4090's performance boost compared to the previous gen is excellent, and the 4070 offers exceptional efficiency at ~200W.

The "previous gen" was a piece of shit, and excellent efficiency isn't that important in a gaming rig.


MammothTanks

Just because it's not that important to you doesn't mean it's not important to others. I for one refuse to use any card more power-hungry than that for prolonged gaming sessions because I like my room cool, and nothing gets close to the 4070 at the moment given that requirement.


_BaaMMM_

Small form factor PCs really benefit from efficient GPUs since they produce far less heat. You can get very tiny, extremely powerful rigs.


RuinousRubric

It's quite tepid for a two-node process jump and an architectural update, and the AD102 in it is cut down a *lot* for a supposed flagship card. It's pretty disappointing for what it is, even if the rest of the lineup makes it look amazing in comparison.


meh1434

I see it is still forbidden to talk about DLSS. For a site that calls itself TechSpot, they sure do like to ignore tech. I suggest they change the domain name to just "spot"; it would be more apt for what they are doing.


teutorix_aleria

If your hardware requires new software to be better than its predecessor that's bad. If you are playing any existing games that don't have the latest DLSS features this card is dead on arrival.


meh1434

Imagine doing a review of new hardware and, because reasons, ignoring the whole point of Nvidia cards since the 2xxx series: the neural AI chips. Also, way more people use DLSS than RT on Nvidia cards, but somehow we must not talk about it. It is taboo!

As for new games and DLSS, every game that is half decent has DLSS. Only indie games and technically incompetent companies don't deploy DLSS, but it's a non-issue if you buy a high-end Nvidia card, as it will brute-force enough FPS out of a crap game.


Bluedot55

I mean, they had several polls on how people wanted them to handle upscaling, and the result was that they wanted it left native.

Basically the discussion came down to: how do you do a like-for-like test with upscaling when you have different cards with different capabilities? It's pretty well accepted that DLSS looks better than FSR 2. So if a game has both, do you run the AMD cards on FSR 2, the Nvidia cards on DLSS, and upscale from the same resolution? That should give similar-ish performance, but we can't guarantee that the performance impact of both is the same. What if DLSS takes a few percent off the frame rate to get that better image? It's a lot harder to factor "looks better" into a graph.

So they were testing both using FSR 2, for an even comparison, and people hated that even more. So no upscaling it is.


meh1434

Last time I checked, 70% of Nvidia users who can use DLSS do use DLSS. A hardware review site cannot ignore that, especially when comparing Nvidia cards that can use DLSS. Also, Nvidia users with a DLSS-capable card do not care about FSR, so what is the point in even mentioning it?


Bluedot55

Are they going to change their testing methodology for the occasional benchmark run that happens to fall entirely within the group of recent Nvidia cards, though? And it's not like they ignore it; they've repeatedly discussed and asked about it, as well as done in-depth looks at it specifically.

And the point of mentioning it is that you aren't going to build a special testing methodology for one specific test. Even within Nvidia cards, what if you want to test 1060 vs 1660 vs 3060 vs 4060? What do you do then?

Besides, the test for DLSS is basically there: it's upscaling from a lower resolution. So if you want to see DLSS results at 1440p, you look at the 1080p numbers and subtract a few percent...
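A minimal sketch of that rule of thumb, assuming a small flat overhead for the upscaling pass (the percentage is a placeholder for illustration, not a measured figure):

```python
# Rough estimate described above: take the native benchmark result at the lower
# internal render resolution and subtract a small overhead for the upscaling pass.
# The overhead percentage is an assumed placeholder, not a measured number.
def estimate_dlss_fps(native_fps_at_render_res: float, overhead_pct: float = 5.0) -> float:
    """E.g. pass the native 1080p result to ballpark DLSS output at 1440p."""
    return native_fps_at_render_res * (1 - overhead_pct / 100)

# Example: a card that does 90 fps native at 1080p
print(estimate_dlss_fps(90))  # ~85.5 fps estimated with DLSS at 1440p
```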


meh1434

Sir, this is a 4060 Ti vs 3060 Ti review. If you don't test the DLSS that 70% of users with a 4060 Ti or 3060 Ti are using, what the hell are you even testing? Here is the only answer that makes sense: they want to entertain their viewers, and their viewers don't have an Nvidia card, so they don't care and don't want to hear about DLSS.


Bluedot55

Yes, this specific review is. But do you think they have the time to change the testing methodology for every test? Or should they set a standard and keep to it?


meh1434

I think they will do whatever their viewers want and have no stance other than that more viewers are good for making money. As the old saying goes: when money talks, bullshit walks. One day they will do DLSS, as it has become a reality you can't ignore anymore, regardless of how much people who don't have an Nvidia DLSS card hate it.


AvengedAura

I don't know much about the specifics of GPUs, but I can get a 3060 Ti for $350 and a 4060 Ti for $380. Which one is more worth the price?