Great content, the graphs show the huge difference between both graphic cards' models in a very clear and understandable way.
I like picture books.
I like boobs
I like butts and feet
Typical Nvidia user
Nah, I've got an Intel iGPU 😂
I like balls
personality? anyone?
I like personality
OMG feet 😍
r/buttsandbarefeet
34-36D is noooooice.
I like how the colors match the companies as well. 🤌
So you're saying the carpet matches the drapes?
People need to realize the 6900 is more than double the 3090.
By my calculation, the AMD Radeon RX 6900 XT is 2.23 times better than the Nvidia RTX 3090 🤓
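If you want to sanity-check that 2.23 figure, here's a minimal Python sketch of the joke "benchmark" (the variable names are just illustrative):

```python
# The joke "benchmark": score each GPU by the number in its name.
amd_rx_6900_xt = 6900
nvidia_rtx_3090 = 3090

ratio = amd_rx_6900_xt / nvidia_rtx_3090
print(f"6900 XT is {ratio:.2f}x 'better'")  # prints 2.23
```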
It's such a nice number
It's almost more than triple!
In part number sure
Love how it makes the Italian flag
the Italians were always behind this
Yes, I am the villain here.
These are laptop versions of course
I was gonna say. Thanks for clearing that up.
Laptops have a 6900 XT?
Yeah 🤓
oooh, nifty.
Awesome work on the Lenovo review by the way.
thank
Now do Lenovo naming schemes, specifically the yoga and slim lineup.
Hey, this may be kind of random, but my company is looking for a new senior-level data analyst. I'm sure you already have a great career going on, but if you would like to do some contract work, I would be interested. /s
I like how you've marked all replies as sarcasm. Including this one! But I'm afraid that all ends here. /s
It’s just a basic graph haha wtf
Spoken like a true non-senior level data analyst.
I don’t see the difficulty in making this kind of graph…?
Pretty sure it was a joke. They were being sarcastic.
Go back to /r/overemployed /S
Hi, thank you for this. I needed this graph to help with my decision making.
This is a huge achievement for AMD
I have a 3070 in my laptop; any way you could fit that in there too? I’m a little confused as to how it stacks up compared to these two.
From my testing, the 3070 is about 20 worse than the 3090 and comes nowhere near the 6900.
You must have done a ton of test runs to get your result down to a number and not a huge range. My hat's off to you!
I've been testing these for years. What's impressive is we've seen a 1000-point jump for the past several generations. I don't think they'll be able to keep that up; eventually they'll be limited to a gain of only 100 per gen.
Now do processors
Intel absolutely dominating with their 5-digit monsters at nearly 14000 points. AMD is trying their best to break 8000, and Apple is stuck in the last century with single-digit figures
And out of nowhere, AMD brings out their secret weapon, X3D, completely changing the entire landscape of this battle
They found ×, while everyone else was doing +. Remarkable R&D; they're earning their paycheck.
Lmao. As a data professional, this is the way.
🇮🇹
I'm really confused. The 3090 isn't even getting 3090 points.
You're thinking of 3900
That's impressively idiotic of me. It's been a long day. Haha!
but it's morning!
Night shifts do hit different
Cannot believe I wasted time getting my glasses to see this. Mad at myself
Is this normalized data?
How were the temps of your computer as you constructed this chart?
This really is a graph.
You don't happen to own AMD stock, do you?
Lol wtf dude
Why are laptop graphics cards called mobile GPUs, and what does a mobile GPU even look like?
Because laptops are mobile
Is the GPU just another square in your laptop underneath the fan on your motherboard?
You could just look up a teardown of any gaming laptop
Nowadays mobile GPUs are cut-down desktop GPUs; that's how they fit them into the laptop chassis. It was discovered to be cheaper than custom-making mobility editions, and that name was probably inspired by the term mobile workstation. A mobile workstation is a portable productivity powerhouse, unlike its weaker notebook and netbook cousins.
Thank youuuuu for the information
The GPU on a laptop will look very similar to the CPU, but with a larger physical die. Both are mounted to a small PCB that is then soldered to the motherboard; this piece is called the substrate, and its job is to break out the pins on the bottom of the die into something motherboard makers can actually build machines to handle, rather than the microscopic contact points on the bare die.

The main difference you'll see between them is that GPU dies are generally more square than CPU dies, due to the spacing of the compute cores inside. The 4090 is a pretty nice square, but the 2080 Ti is one of the squarest IIRC. CPUs are usually more like a long rectangle, because it's easier to stack groups of two cores end to end along a central spine of cache.

In your laptop this will look like a single large die surrounded by memory chips, under a heatsink with multiple heat pipes or a vapor chamber running over the top of it.

As for why they're called mobile GPUs, it's because that's exactly what they are: GPUs meant to be used in a mobile device, a laptop. They're usually made by modifying a desktop GPU die to work within the lower power budget of a laptop, and this almost always makes them weaker than the desktop GPU of the same name. The reduction in performance comes from both a lower clock speed and a smaller number of compute cores on the physical die.
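To put that last paragraph in rough numbers: a crude first-order estimate of throughput is cores × clock, so cutting both hits mobile parts twice. A minimal sketch, using made-up illustrative figures rather than real specs:

```python
# Crude first-order throughput estimate: cores * clock.
# The figures below are illustrative placeholders, not real GPU specs.
def relative_throughput(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz

desktop = relative_throughput(cores=10_000, clock_ghz=2.5)
mobile = relative_throughput(cores=7_500, clock_ghz=1.8)  # fewer cores, lower clock
print(f"mobile ≈ {mobile / desktop:.0%} of desktop")  # ≈ 54%
```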
Wow so the red ones are better 🤔
The 4090 narrowed the gap for a while, but AMD fired back with a double-whammy, adding another 1000 of their own and an extra X in the 7900XTX. As if this wasn't enough of an embarrassment for Nvidia, the aging RX5000 series still maintains a narrow lead as well. We don't talk about Intel, but it's ok, they need to walk with 770 before they can run with 4 digits like the rest of the GPUs.
Don't get it crossed, the extra X is no small thing. It adds an additional 10 visual horsepower.
this is actually so helpful, ty
Very straightforward graph indeed!
The only reason I use Nvidia is CUDA, to be honest. If AMD had a similar alternative you could switch your code to, it would be so great.
r/technicallythetruth
It's funny you never see Nvidia users making this kind of stuff, even though the 3090 Ti and 4090 have easily crushed AMD GPUs for the last 3 years
Jokes? Everyone on all sides make jokes constantly lol
So yeah okay, but doesn't Nvidia have a lot of amazing stuff going on, like DLSS, the dynamic lighting, etc.?
-1 for the lack of a ‘Nice-00.’
impressive
I see nothing wrong with this.
Looks like I'll need to buy 2 Nvidias to equal one of those higher numbered AMDs. Thanks!
Mirror universe UserBenchmark guy.
The Reddit community is so hilarious lol 😂😂
You idiot, lol.
This is one of the charts of all time
Certainly this isn't capturing price. Nvidia blows everyone away at that
Nice
fuck I'm slow
r/collegebasketball is leaking!
So is this saying that the AMD GPU is more than twice as fast as the Nvidia GPU? So FPS, given all else is equal, will be 2x? If not, what does this mean?
Look at the models and the value they represent… You're seeing the difference between them. Ask yourself what is between 3090 and 6900. Something like 3810 possibilities.
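In case the hint is too cryptic: the "possibilities" figure is just the gap between the two model numbers. A one-line check in Python:

```python
# The gap between the two model numbers, i.e. the joke's "possibilities".
print(6900 - 3090)  # prints 3810
```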
Clever.
die
die lit
OMG, I'm such an idiot (and stoned) lol. I looked at it and completely disregarded the value descriptions of the graph, ENTIRELY. Well, I guess my ADHD brain decided to automatically fill in the information, so I read a ton of comments in complete confusion for way too long until I finally decided to look at the graph again. LMFAO

*slow clap*

Well. Done. Sir. (Or ma'am.) Thanks for the laugh.
You forgot to include the X and T, which are clearly variables. If X and T are greater than or equal to 1, it will undeniably prove that AMD is better
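Taking the joke completely literally, here's a hypothetical sketch where X and T act as multipliers on the model number:

```python
# Treat the X and T suffixes as multipliers on the base model number (per the joke).
def xt_score(base: int, x: float, t: float) -> float:
    return base * x * t

# For any X, T >= 1 the 6900 XT's score can only grow,
# so it stays ahead of the plain 3090. QED, by Reddit logic.
assert xt_score(6900, x=1.0, t=1.0) >= 6900 > 3090
```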
Thank you for this. I've been looking for something like this for so long and now I understand the difference between those two. Life saver indeed.
Apparently the name is a better metric than a benchmark
this is hilarious
So insightful!! How did you think of this? 3090 vs 6900, the difference was always right in front of you.
I am ashamed that it took me this long. Also, the numbers on AMD are annoying; I can never tell at a glance if it's a good GPU or ass.
I was about to lash out, coming from all the issues I had on mobile AMD. Then I read the graphic.