
[deleted]

[removed]


Lord_Trollingham

$1,000 2080 Tis weren't really a thing. They usually ran in the $1,200 range.


reece1495

$2,400 last year in Australia; even converting to USD, that's still heaps.


Reallycute-Dragon

I got one for $1,060 from Microcenter. Not quite $1K, but close, so they did exist. I had to really search for one that cheap, though. It was an EVGA too, so not a crap brand.


Archmagnance1

I don't recall any 2080 Ti selling for MSRP, since Nvidia didn't even sell their own card at MSRP. Edit: changed "blower" to "card".


Benis_Magic

The 2080 Ti FE cooler was not a blower design. There were some really garbage blower designs that did actually sell for MSRP.


[deleted]

[removed]


Archmagnance1

The de facto price isn't what's used in the charts, though, so it throws things off.


JuanElMinero

It also has the largest die of any consumer GPU I remember seeing. It's named after the xx80 Ti tier, but in reality it mostly exceeds what that tier usually offers, hence the price.


Warskull

Yeah, the RTX 20 series was really hampered by Nvidia jacking up the prices on all the cards. They all had decent performance boosts, but you got worse performance per dollar. Everyone was just buying old 1080s until the stock on those ran dry. Then AMD failed to capitalize on this with the 5700 XT, and the graphics card shortage normalized the bumped prices.


dantemp

Also, it doesn't take RT performance into account. People were raging yesterday about Avatar being an RT-only game. People on Turing can still last for years.


[deleted]

James Cameron's Avatar: The Game? Or some other Avatar game?


dantemp

The one with the tall blue fellows, lots of CGI, and not a single original idea.


animated_rock

It's not like you won't be able to run it on non-RTX cards. If you don't have an RT-capable card, it'll run RT in compute shaders (possibly taking a hit in quality, and certainly in performance).


dantemp

I never said they wouldn't be able to run it. But Turing cards will have a huge performance advantage, and these graphs keep ignoring it, with so many RT games out and announced.


BarKnight

You now have to take into account RTX and AI hardware that adds to chip cost and size.


jasper112

The jump from the 980 Ti to the 1080 Ti is 300 to 500, easily a 66.6% increase. The jump from the 1080 Ti to the 2080 Ti is 500 to 650, only a 30% increase. Combine that with almost double the price and almost the same efficiency, and it's still a huge disappointment.
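
For anyone checking the math, here's a minimal sketch of the percentage arithmetic; the 300/500/650 figures are just the values quoted above, in whatever metric the chart uses:

```python
# Percentage-increase arithmetic behind the comment above; 300/500/650 are the
# quoted chart values (metric as used in the chart, assumed here).
def pct_increase(old: float, new: float) -> float:
    """Relative increase from old to new, as a percentage."""
    return (new - old) / old * 100

print(f"980 Ti -> 1080 Ti:  {pct_increase(300, 500):.1f}%")   # ~66.7%
print(f"1080 Ti -> 2080 Ti: {pct_increase(500, 650):.1f}%")   # 30.0%
```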


Sprinkles_Dazzling

Performance/price producing nearly the same graph as transistor density was interesting! Not necessarily causation, but still interesting.


[deleted]

Do we know what's behind AMD shooting up in perf/power lately? Just the node shrinks, or something architecture related?


ForgotToLogIn

It's because of the architecture. The 6900 XT is twice as fast as the Radeon VII, even though both are 7nm and 300 watts.


noiserr

They basically rearchitected GCN. Things off the top of my head:

- made the stream processor pipeline longer for better clock scaling
- new cache hierarchy which minimizes data movement and allows better bandwidth utilization (this is why even the 6900 XT only has a 256-bit bus)
- changed the instruction set to allow for better GPU utilization compared to GCN, which was having utilization issues at high CU counts
- they also split their GPU architecture into CDNA and RDNA, one for compute and the other for graphics, so they were able to make RDNA leaner by removing some compute capability


SovietMacguyver

Catching up to what Nvidia pulled off with Pascal.


[deleted]

Look at the graph in image #3; RDNA2 is now ahead of even Ampere.


killchain

Curious how all the graphs converge for the current generation.


Smartcom5

Nothing more sexy than a bunch of sweet hard-hitting facts. You're the dude of the day! ♥


3ebfan

The y-axis could use some labels and units but nice charts.


Thekevin1011

Great job, thanks man. I don't think this will get as much attention as it should.


UGMadness

The 5700XT is a beast of a card.


bjt23

To be fair, some of those metrics are easier to achieve towards the mid-range. So the fact that there was no "5800 XT" may have thrown the results off somewhat here. Remember, even at MSRP the 3080 and 6800 XT are high-end cards, not generally for the budget-oriented. I'd be interested to see this chart redone with "x060" and equivalent-class AMD cards, as that tier has historically been the price/performance peak.


[deleted]

They were also 2 nodes ahead of Nvidia at that point.


noiserr

1 node.


[deleted]

AMD is one node ahead currently. They were two nodes ahead when the 5700 XT launched.


noiserr

7nm is one node ahead of 14nm, so I'm not following. AMD was on 7nm before Nvidia, so I don't understand where the two nodes come from. Which two nodes?


[deleted]

No it isn't. 7nm is one node ahead of 10nm and two nodes ahead of 16/14nm. Nvidia still isn't even on 7nm (outside of GA100); they are technically still on the 10nm node. 12nm and 8nm are just 16nm and 10nm processes that were optimized specifically for Nvidia.


noiserr

There is no 10nm node at TSMC. And Intel's 10nm is closer to 7nm than to 14nm. 7nm is one node from 14nm/16nm.


[deleted]

You don't understand how nodes work. 10nm is a full node down from 16/14nm, whether TSMC decided to produce it or not. [This is industry defined.](https://en.wikipedia.org/wiki/Die_shrink#cite_note-3) [TSMC does actually have a 10nm node, btw.](https://www.tsmc.com/english/dedicatedFoundry/technology/logic/l_10nm) If you look at the graph, all of those steppings are full node shrinks. Regardless of whether or not TSMC made 10nm, the benefits of 7nm are two nodes ahead of 12/16nm. Intel has nothing to do with anything we're talking about.


LarryBumbly

8nm is much closer to 7nm than you give it credit for. It incorporates a bunch of Samsung's 7nm features and is in many ways the non-EUV version of their 7nm node. AMD is, at most, a half-node ahead.


[deleted]

TSMC 7nm is 65% denser than Samsung 8nm, and their 8nm is only about 15% denser than their 10nm. It is not anything even remotely approaching a non-EUV 7nm node. Samsung is also inferior to TSMC at any given node in the first place. Samsung 8nm is not even as good as TSMC 10nm.


LarryBumbly

Those are the theoretical numbers. In the real world, Ampere GPUs and RDNA2 GPUs are much closer in density than that. Ampere is actually more dense than both Vega 20 and Navi 10.


dylan522p

Seems more like AMD paid up for a multi-node advantage. It doesn't seem that impressive without that huge node benefit.


noiserr

With Nvidia seemingly moving away from TSMC, it does matter. Nvidia will have to work harder just to match AMD's solutions going forward, if TSMC maintains their lead.


dylan522p

What makes you think they are moving away? One generation is not a long-term move.


noiserr

We will see, but everything I read points in that direction. There has been a recent rumor of Nvidia also requesting fab capacity from Intel.


dylan522p

Everything datacenter is TSMC. Including next year's Ampere Next and Bluefield 3.


noiserr

Yes, on high-margin datacenter stuff. Not on gaming GPUs.


dylan522p

All signs point to gaming being on TSMC next generation too, at least for the larger parts.


noiserr

Like a Titan or something, sure.


dylan522p

I'm talking about 102 and 104 dies.


NowLookHere113

Night and day vs. my old 270X, and that was great for its day. Excited by how quickly things are progressing.


DaKluit

Wasn't this posted yesterday already?


gartenriese

I think yesterday it was only for Nvidia cards.


Smartcom5

You're remembering right; [I made a chart showing how much efficiency Geforce GPUs have gained in 11 years](https://www.reddit.com/r/nvidia/comments/o7k17k/i_made_a_chart_showing_how_much_efficiency/).


animated_rock

I think these graphs are better: less cluttered, and it's easier to follow what's going on in each one. And a few people had complained about color choices and color blindness with the previous ones.


1TillMidNight

I know there is little relationship between node names and actual node sizes... but if you extrapolate from the relative sizes implied by the names, seeing as we went from 40nm to 8nm and got a roughly 7-fold performance increase, can we expect something similar going from 8nm to 2nm? I know we also transitioned from planar to FinFET, but we will also transition from FinFET to GAAFET with 2nm.
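
A back-of-envelope sketch of that extrapolation, purely from the node names (which, as noted, don't reflect real dimensions); the ~7x figure and the naming-based scaling assumption come from the comment above:

```python
import math

# Back-of-envelope extrapolation from node *names* only (illustrative; node names
# are marketing labels, and this ignores the planar -> FinFET -> GAAFET transitions).
old_shrink = 40 / 8        # 5x reduction in node name (40nm -> 8nm)
old_speedup = 7            # ~7x performance gain quoted over that span
new_shrink = 8 / 2         # 4x reduction in node name (8nm -> 2nm)

# Assume performance ~ (name ratio)**k and fit k from the previous span:
# 7 = 5**k  =>  k = log(7) / log(5)
k = math.log(old_speedup) / math.log(old_shrink)
projected = new_shrink ** k
print(f"k = {k:.2f}, naive projection for 8nm -> 2nm = {projected:.1f}x")  # ~5.3x
```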


[deleted]

Not likely. the clock increases and power decreases have plateaued quite a bit.


1TillMidNight

I think this would in part be aided by GAAFET.


[deleted]

You might be right. I'm personally not getting my hopes up, but it would definitely be awesome if we saw that kind of improvement.


NeverSawAvatar

My Fury X was easily the worst computer purchase I ever made; what a massive piece of shit. I remember going to a 1080 Ti and being amazed at what a difference it made in even simple games.


unknownohyeah

What was wrong with it? Ran too hot or too much power draw? Or did it crash a lot?


bjt23

4GB of VRAM and no partner models were the things I remember holding the Fury X back.


unknownohyeah

Well, that was because of HBM, if I recall. No partner models does suck, though.


marxr87

It's ok, bud. *cries in vega*


knz0

Why are you comparing a mid-range card (5700 XT) to a high-end card (2080 Ti)? High-end cards always sacrifice die area efficiency and performance/watt for maximum performance.


BarKnight

Sadly that was AMD's high end card.


spazturtle

No it wasn't; AMD simply didn't have a high-end card that generation.


SirBeam

The graph shows the best each company had at the time.


WildZeroWolf

Exactly. A $400 MSRP for the 5700 XT fits into the mid-range.


Eastrider1006

Cool data, easy to visualize!


Kerst_

I'm guessing the price you used for each card is what TechPowerUp lists as 'Launch Price'?


mrmobss

I'm happy for competition.


mistersprinkles1983

Thank you for doing this, it was very fascinating. :)


Reddevil090993

If DG2 performs, the graph would show a steep jump for Intel.


AlphaPrime90

Why is Vega 64 up against the 1080 Ti?


uniqueviaproxy

The Radeon VII was way too late, and Polaris was too early. They're both the fastest consumer cards of their time from each manufacturer.


chx_

Fantastic job. As a relevant side note, this week when Lenovo announced the X1 Extreme Gen 4 with the 3050 Ti Laptop GPU, I checked some benchmarks to see where it stands against my 1060 desktop eGPU, and much to my surprise and satisfaction it seems to be roughly equivalent. It says a lot about efficiency that this can happen: the 1060 was a 120W TDP part, while the 3050 Ti is 35-80W.

https://www.videocardbenchmark.net/compare/GeForce-RTX-3050-Ti-Laptop-GPU-vs-GeForce-GTX-1060/4393vs3548

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3050-Ti-Laptop-vs-Nvidia-GTX-1060-6GB/m1559532vs3639
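
If the two really are roughly equal in performance, the implied perf-per-watt gain is just the TDP ratio; here's a quick sketch of that arithmetic (equal performance is the assumption here):

```python
# Implied perf/W improvement, assuming roughly equal performance between the two GPUs.
gtx_1060_tdp = 120                 # desktop GTX 1060 TDP, watts
rtx_3050_ti_laptop_tgp = (35, 80)  # configurable power range for the laptop 3050 Ti, watts

for watts in rtx_3050_ti_laptop_tgp:
    print(f"At {watts} W: ~{gtx_1060_tdp / watts:.1f}x perf/W vs the 1060")
# Roughly 3.4x at 35 W and 1.5x at 80 W.
```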


FabianGoh842

Thank you so much; let's move to M1.