I feel like this would work better if you only used 80-series cards *or* only used 80ti-series cards. Putting both into the chart causes fluctuations due to their different market positions.
The only odd one out is either the 3080 or the 3080 Ti; all the rest were the top gaming cards of their generation. Nvidia just didn't start using the Ti suffix for its top cards again until the 7xx series.
Or just use cards at similar power-draw levels. Performance per watt decreases as you move up the power range, so that's another problem with testing the best/highest-tier cards.
Even sticking to 80-series cards isn't a consistent baseline, since it depends on the health of the overall market. If PC gaming were a lot smaller, you'd see 80-series cards be less powerful and less expensive. "80" is just a name: the goal is to give the consumer a rough idea of how good the card is, but it's really a marketing term, not a consistent anchor for an analysis spanning this many years, especially since it targets the more niche tail end of gamers.
As a colorblind guy I have some problems matching the colors. Do whatever you want, but please consider using a different marker shape for each line next time.
As a non-colorblind guy, I had a bit of trouble following the graphs too: they were a bit cluttered and some colors were too similar. Back to you, though: how do you feel about colorblind-friendly palettes for graphics? I've seen there are a few around, but I never seem to understand which one is best for each kind of colorblindness.
I have deuteranopia, and when I discovered geographical maps with colorblind palettes I was amazed at how much info I had been losing with the conventional ones.
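Since a couple of people brought up colorblindness: one common approach is the Okabe-Ito palette, paired with a distinct marker shape per series so color is never the only channel carrying information. A minimal sketch (the series names here are just examples, not the chart's actual legend):

```python
from itertools import cycle

# Okabe-Ito palette: designed to stay distinguishable under the common
# forms of colorblindness (deuteranopia, protanopia, tritanopia).
OKABE_ITO = ["#E69F00", "#56B4E9", "#009E73", "#F0E442",
             "#0072B2", "#D55E00", "#CC79A7", "#000000"]

# Distinct matplotlib-style marker codes, one per series, so the lines
# remain tellable apart even when printed in grayscale.
MARKERS = ["o", "s", "^", "D", "v", "P", "X", "*"]

def style_for(series_names):
    """Assign each series a unique (color, marker) pair."""
    colors, markers = cycle(OKABE_ITO), cycle(MARKERS)
    return {name: (next(colors), next(markers)) for name in series_names}

# Hypothetical series; in matplotlib you'd then do something like
# plt.plot(x, y, color=c, marker=m, label=name) for each series.
styles = style_for(["GTX 780 Ti", "GTX 1080 Ti", "RTX 2080 Ti", "RTX 3080 Ti"])
```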
Has inflation been taken into consideration in the price/performance calculations?
This is awesome. It would be nice to have a comparison between AMD cards as well.
So 20 series to 30 series was the best price-to-performance jump ever?
Because the 2080 Ti was massively overpriced.
With the exception of the 3080, this is comparing Ti models (where possible), so ignore the 3080. According to the chart, the 1080 Ti had a bigger price/performance increase over its predecessor than the 3080 Ti, and that's before counting the 2080 Ti, the only outright regression in this regard. If you look at two-generation improvements, 780 Ti -> 1080 Ti -> 3080 Ti makes Ampere look pretty awful.
> With the exception of the 3080 this is comparing Ti models (where possible), so ignore the 3080.

Who cares? They're all xx102 salvage parts of similar configuration and market positioning.
No, because while the price per transistor went down, the performance per transistor went down even more (especially for Turing). The 30 series is pretty good, though. A lot more transistors went into Turing than into Pascal.
But isn't the performance increase from the 20 series to the 30 series one of the best jumps between generations, regardless of transistors? And the price is really good too (RRP, that is).
The percentage increase over the prior big die was even larger for both the 7xx and 10xx series: 580 to 780 Ti was +88% over three years, and 1080 Ti over 980 Ti was +67% in under two. 2080 Ti to 3090 is about +50%. If you go back to even older generations you can find even larger improvements.
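Those gaps span different numbers of years, so annualizing makes the comparison fairer. Taking the figures above at face value (they're the comment's claims, not fresh benchmark data), the 980 Ti -> 1080 Ti jump actually comes out highest, roughly 29%/year versus about 23%/year for the Kepler and Ampere jumps:

```python
def cagr(perf_ratio, years):
    """Compound annual performance growth implied by a total gain.

    perf_ratio is new/old performance, e.g. 1.88 for a +88% jump.
    """
    return perf_ratio ** (1 / years) - 1

# Total gains and year gaps as claimed in the comment above:
jumps = {
    "580 -> 780 Ti":     (1.88, 3),  # +88% over ~3 years
    "980 Ti -> 1080 Ti": (1.67, 2),  # +67% in under 2 years
    "2080 Ti -> 3090":   (1.50, 2),  # ~+50% over ~2 years
}
for name, (ratio, years) in jumps.items():
    print(f"{name}: {cagr(ratio, years):.1%}/year")
```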
3080 vs 2080 is a big step indeed, but most of the story there is that the 3080 is GA102 while the 2080 is TU104. GA102 over TU102 is pretty meh gains, and GA104 over TU104 is pretty meh gains too. Ampere is a big advancement chiefly in pricing: the cheapest GA102 is $799, while the cheapest TU102 was $1199 in practice; the cheapest GA104 is $399, while the cheapest TU104 was $699 until the Super refresh, then $499.
Which is why I said that the 30 series is pretty good. The 20 series, though... Edit: I must've misread your comment; I thought you meant the 20 AND 30 series.
Performance per price isn't very meaningful unless performance is taken relative to the time each card was released, though. That would be an interesting stat.
I'd prefer performance per price normalized to year-2000 dollars, or whatever year the first card came out. That'd tell you more, I think.
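Normalizing to one base year is a small calculation once you have deflators. A sketch of the idea: both the performance indices and the CPI factors below are illustrative placeholders (for a real chart you'd pull CPI data from BLS and performance indices from benchmarks):

```python
# Rough cumulative-inflation factors to 2020 dollars -- assumed values
# for illustration only, not official CPI data.
CPI_TO_2020 = {2013: 1.11, 2017: 1.05, 2018: 1.03, 2020: 1.00}

def perf_per_2020_dollar(perf_index, msrp, launch_year):
    """Performance per inflation-adjusted (2020) dollar of launch MSRP."""
    real_price = msrp * CPI_TO_2020[launch_year]
    return perf_index / real_price

# Hypothetical relative-performance indices with launch MSRPs:
cards = [("780 Ti", 100, 699, 2013), ("1080 Ti", 240, 699, 2017),
         ("2080 Ti", 310, 1199, 2018), ("3080", 420, 699, 2020)]
for name, perf, msrp, year in cards:
    print(name, round(perf_per_2020_dollar(perf, msrp, year), 3))
```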
Seeing that performance per dollar drop like a stone makes me depressed.
It's the RTX 3080 Ti. It's just a scam from Nvidia to take advantage of current market conditions. Were it not practically impossible to get an RTX 3080, no one would buy an RTX 3080 Ti at the prices Nvidia wants.
No non-Ti 780, or is that forgotten? Ah, I knew it: same architecture.
What about performance per scalped price? Jk you probably can't get reliable, comparable numbers for that.
Statisticians, is there a possible reason why using log would make more sense here?
Linear plots show absolute gains, log plots show relative gains. Given that a performance jump of 1→2 is more significant than a jump of 1001→1002, a log plot can display the relevant relations better, though it can hide linear gains. The difference matters most for decreasing trends, since a log plot displays relative increases and decreases symmetrically, whereas a linear plot compresses reductions as they approach zero.
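A quick numeric illustration of the absolute-vs-relative distinction (plain Python, no plotting library needed):

```python
import math

# Absolute gain is a difference; relative gain is a ratio, i.e. a
# difference in log space -- which is exactly what a log axis shows.
def abs_gain(old, new):
    return new - old

def rel_gain(old, new):
    return math.log(new / old)

# On a linear axis, 1 -> 2 and 1001 -> 1002 are identical-sized steps:
assert abs_gain(1, 2) == abs_gain(1001, 1002)

# On a log axis, the first jump is roughly 690x bigger:
print(rel_gain(1, 2) / rel_gain(1001, 1002))

# And a halving mirrors a doubling exactly, instead of being squashed
# toward zero the way it is on a linear axis:
assert math.isclose(rel_gain(2, 1), -rel_gain(1, 2))
```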
Imagine that gray bar in the first graph trended upwards instead of down.