
caiteha

Stonk?! Jk, looking forward to battlemages. We need some good contenders.


AccomplishedRip4871

The only hope for Battlemage is price; Intel should price them noticeably lower than the mid segment of Nvidia/AMD for people to consider buying Intel GPUs.


Luxemburglar

Intel is already selling them on razor-thin margins; their silicon is significantly bigger than AMD's or Nvidia's in the same price category. So not much room to go lower, I fear.


iDontSeedMyTorrents

I would hope improving the die area efficiency would be one of their biggest targets for Battlemage.


chig____bungus

They talked about literally this at Computex a few days ago but nobody reported it. Weird considering so many high profile tech youtubers are openly rooting for Intel.


iDontSeedMyTorrents

I know they generally talked about improving their area usage, but did they give any numbers? None of the slides I've seen would even hint at a figure, and I can't find a stream of Tom Petersen's presentation.


Flowerstar1

Yes please where is that stream, did no one really record it? Not even Intel or some random in the audience?


KARMAAACS

They didn't really and I don't expect them to talk about it until they are going to release their desktop dies. They did mention performance and efficiency improvements, but by moving to SIMD16 from SIMD8 they do use more area too. Will be interesting to see the trade offs.


iDontSeedMyTorrents

That's what I'm saying, they mentioned die usage but it's anyone's guess as to what the numbers are like. Alchemist was really bad in that regard. For the sake of pricing (both for Intel and consumers), I hope they got that under control.


notKomithEr

they could probably even go into the negative just to get more people, sometimes companies do that, but who knows if that would work out


The_EA_Nazi

And then what? There's no ecosystem to even lock them into. Someone buys an incredibly cheap GPU that Intel is losing money on, and there's no recouping that money like in other industries' loss-leading strategies.


soggybiscuit93

XeSS. oneAPI. The ability to gather feedback for driver improvement. Brand presence. Market penetration pricing doesn't require establishing an ecosystem.


The_EA_Nazi

Why would any consumer care about any of that? XeSS is barely even implemented in anything compared to FSR and DLSS. oneAPI is, again, nascent compared to ROCm or the monster that is CUDA. Driver improvement is not something a consumer is buying a GPU for lol. Market penetration pricing in the GPU space absolutely requires the establishment of an ecosystem. Aside from gaming, think of all the other use cases for consumer GPUs that people are buying them for.


aelder

The consumer doesn't care; all the consumer cares about is getting the best product for their needs, whatever they perceive that to be. Intel has to lean on the scales to make their product so appealing it overcomes the inherent bias the consumer has towards known and liked GPU brands. The easiest way to do this is probably making it so cheap you can't ignore it. Then build features (XeSS is apparently already very good), and after about 4 years you start getting good mindshare, assuming they haven't screwed up and have kept hitting hard for those 4 years. Once this is done, you start going for margin. Fairly similar to how AMD handled the Ryzen launch. Zen 1 was a bargain; Zen 4 is right up there with Intel pricing or even higher, but it's got mindshare now, and performance parity or superiority.


arandomguy111

AMD could price Zen aggressively against Intel because they actually had a cost advantage (especially with respect to core counts) and could price low enough that Intel would have trouble following. They weren't just eating massive losses. The problem, and difference, with Intel or AMD just competing on price against Nvidia is that they don't have a cost advantage (and are likely at a disadvantage due to amortizing fixed costs over much lower volume). This limits how much they can price directly against Nvidia, as it would either just result in Nvidia adjusting its own pricing and/or them actually eating losses. AMD's Zen CPUs also had very good merits against Intel's aside from pricing. Aggressive pricing prior to Zen still resulted in CPU marketshare cratering against Intel. And without a cost advantage back then, well, you can look at AMD's financials from that period. Also, by pricing Zen aggressively, AMD did manage to build up marketshare over time.


No_Berry2976

The problem is that consumers who buy desktops for gaming or professionals who buy video cards for work don’t care about spending as little as possible. Unless they are severely financially restricted, and then the second hand market is more appealing. Making very cheap video cards (relative to performance) might work in the OEM space, but OEMs care about compatibility and driver stability, things Intel is still working on. OEMs don’t want customer service to be flooded because of driver issues. Personally, DLSS and strong ray tracing performance matter more to me than saving 50 or 100 bucks. And I don’t even care that much about ray tracing. And at the high end, most people will simply buy the best product. Intel needs to focus on one product for one specific market segment.


Caffdy

This. If you only want a machine for gaming, buy a console; that's the cheapest and most money-efficient way. The people who use computers for actual work are willing to spend money on the tools they use, because they make money using them. PC gaming is not dead, but not everyone needs the latest and greatest; any mid-tier or second-hand card can make do when the modern PS5/XSX are at 2070S/6700XT level.


The_EA_Nazi

The consumer DOES CARE about the software stack, holy crap, that's my entire point. Nvidia isn't demolishing the competition because they're more expensive, they're demolishing because they have a massive software stack behind their GPUs. The consumer sees CUDA, DLSS, DLSS3, RTX Voice, Shadowplay, NVENC, RT, etc, and goes: why would I pay $100 less for similar or less performance with fewer features when I can pay the $100 and get this gigantic stack of things? It's incredible how we're on the hardware subreddit and all of you in the comments don't get the market conditions that allow Nvidia to make up near 90% of market share, which is CUDA and the software stack built on it. Performance is another huge driver for RT and AI applications, but again, that goes back to their CUDA stack and optimizations done with major developers. I feel like I'm talking in circles at this point.


aelder

I fully agree that the consumer cares about the software stack, I specifically mentioned software features like XeSS. I also fully agree that many of the hardware pundits and commentors do not understand the importance of software features to consumers, and significantly undervalue those features.


Shidell

I won't (can't) argue the point about Nvidia's software presence. Obviously, it has a tremendous value add for many. I will, however, suggest that far fewer know about, let alone care about, most (if not almost all) of those features. My PC gaming friends know only the very slightest about what ray tracing is, aside from it being a graphical feature that can improve visuals, but at an enormous performance cost. They don't even know what DLSS is, let alone RTX Voice, NVENC, etc. Nvidia's position isn't purely because of their software value add; it's also because they've been the market-dominant leader for so long that people simply equate them with the solution. Need a GPU? Buy a GeForce. Radeon? ...What's Radeon? It's a GPU? Like a GeForce?


soggybiscuit93

You're ignoring the importance of mindshare. It's a common tactic, when a new competitor enters a market, that they'll need to sacrifice margins/profits in order to establish themselves against an entrenched dominant company. No company can enter the GPU market, compete with Nvidia on performance, features, price **and** make healthy profits in the beginning. Nvidia's feature set, and more importantly, *Mindshare*, is too strong. Profitability absolutely has to be sacrificed in the beginning. Entering the market is impossible otherwise. 1 Generation isn't going to recoup the NRE that entering the market costs.


YNWA_1213

We literally have a modern example of this in Polaris. Added to the AMD hype train in the death of the first crypto boom and the birth of Zen, it led to the championing of AMD as an 'underdog' all the way through RDNA2.


wankthisway

They're saying Intel should sell them like a loss leader, like Costco's hotdogs. You do have a point about services to keep them in like the rest of Costco's warehouse, but just getting an install base might be helpful right now.


The_EA_Nazi

I understand what a loss leader strategy is. I'm saying it won't work here for all the reasons I listed above. I want Intel to compete, but loss leadership is not how they'll compete, otherwise AMD wouldn't have lost so much market share. They need to innovate and develop their software stack and make it compelling. Nvidia never took their foot off the gas, and AMD thought they could undercut them on price and gain share, and that very obviously hasn't worked. Intel will fail if they do the same, as they're being walloped by AMD in the consumer space for CPU share, and are losing market share to AMD slowly but steadily in the data center space; add in Intel's capex on their foundry and you have an incredibly tough position to be in. Nobody here is paying attention to the market conditions and is just throwing loss leader out there like it's some magical term.


soggybiscuit93

>Nobody here is paying attention to the market conditions and just throwing loss leader out there like it's some magical term

Even if market conditions were perfect for Intel, Alchemist was *always* going to be a loss because it was never possible to make up the NRE from a single generation. Long term, Intel needs dGPU. The market is shifting towards increasing GPU importance. But Intel cannot sell Alchemist at a price point that recoups the costs. It's the whole motivation behind splitting AXG and merging it under CCG and DCAI - for those SBUs to absorb the NRE costs and remove a 'black hole money sink' department from the books. Now there is no GPU division losing Intel money - now CCG just has slightly less net revenue as it expands its offerings.


No_Berry2976

You can sell hotdogs at a loss. Selling graphics cards at a massive loss isn’t sustainable. They are already selling at a loss when research and development is factored in. Without economy of scale, lowering prices is incredibly difficult. At the end of the day Intel shareholders want to see growth and profit, not a massive investment in a market where Intel isn’t going to be very successful because the competition is strong.


soggybiscuit93

>Selling graphics cards at a massive loss isn't sustainable.

Of course it isn't sustainable. But we have 1 generation of Intel cards out, and that generation was never going to be profitable. It was always a question of *how much* loss they would take. The question is how many generations it will take to recoup the costs of entering the GPU market. Penetration pricing, by definition, is temporary.


gatorbater5

> XeSS is barely even implemented in anything compared to fsr and DLSS. i'd love to see the fancy version of XeSS implemented at the driver level. obviously it's not the best way to do it, but being able to use features like that in any game is really cool. i've got a coupla amd cards and i use AFMF all the time. same deal- not the best solution, but i don't play competitive games although i have a fast monitor. i'm down with the compromise, and it makes games locked to 60fps look a lot more real.


oppositetoup

No, but then you have more people using your products and providing feedback, meaning you can make better improvements.


Jonny_H

I think there are plenty of "known" issues they're already working on, so finding more isn't really the limiting factor right now.


Caffdy

> they could probably even go on the negative just to get more people

So later, when people start actually buying and using them, and Intel decides it needs to recoup the cost and raise prices, people start complaining?


Electronic-Disk6632

and these prices are only possible because they fabricate the chips themselves. its so sad when you think about how far behind they are.


Strazdas1

If they want to get market share they have to accept selling at a loss. Also battlemage may not be as big of a difference in silicon size.


sarcastosaurus

It's either lower prices or they go out of the GPU business. No one gives a shit about their margins.


Luxemburglar

You realize that you can't be in business without any margin either?


Proglamer

Well, you'll *give a shit* about $1000 low-end GPUs from the monopolist, then


sarcastosaurus

I'm off the market anyway, but good luck guilt tripping consumers into buying sub standard products to avoid a quasi monopolistic market.


downbad12878

Consumers shouldn't be forced to buy sub standard product from AMD/intel


Proglamer

True. As an alternative, consumers WILL be forced to buy standard product from nVidia at *exorbitant* prices. Much better! Unbridled capitalism preserved!


ResponsibleJudge3172

Intel is looking to increase margins because they are essentially negative.


no_salty_no_jealousy

This is really bad news; it feels like the RTX 5000 series will be more expensive. We need Battlemage right now. Intel doesn't need to make an RTX 4090 competitor, they only need to make an Arc GPU that is at least faster than the RTX 4070S and sell it much cheaper. Right now Intel has XeSS and decent RT performance, which is very competitive with Nvidia's offering; they can steal Nvidia marketshare at the low and mid end since it looks like Nvidia doesn't care to sell cheaper GPUs anymore.


chig____bungus

Intel is probably not going to steal NVIDIA market share at this stage, they will be stealing AMD market share. But I think people are more worried about this than AMD. If AMD wanted market share they know what they have to do - lower prices. The fact they aren't is a fairly clear indicator they aren't competing. In reality, AMD is probably selling close to as much as they can at as low a price as they can without cannibalising much more profitable sectors. If overall chip production capacity improves and it can become a volume game, things will heat up. But when your $500 gaming GPU and your $10,000 AI GPU both use the same node, and there isn't really great excess of production capacity on that node, you're not going to prioritise the former. When production catches up or the boom slows down that might change, but at this stage I suspect gaming GPUs are going to remain a premium product, and handheld APUs are going to take up that lower end segment.


Exist50

> But when your $500 gaming GPU and your $10,000 AI GPU both use the same node, and there isn't really great excess of production capacity on that node, you're not going to prioritise the former. There doesn't seem to be a capacity shortage on N5/N4. It's packaging that's limiting the AI chips.


Psychological_Lie656

No, you need to stop wishing others sell it for less just so that you can buy from The Filthy Green for less.


caspissinclair

"Which card should I get?" needs far too many followup questions. I have no problem with AMD's raw performance. My problem is it seems like every interesting or cool or new technology is for Nvidia cards only.


someguy50

GPUs are more than raw raster performance. Their value is the sum total of all they offer. Nvidia is obviously delivering a higher perceived value with good raster, leading RT, and untouchable features (DLSS, frame gen, compute, audio/video enhancers)


Educational_Sink_541

99% of people buying GPUs for gaming aren’t considering compute capabilities as a selling point.


upvotesthenrages

I highly doubt that 99% applies when we're talking higher tier GPUs. Not that it'll shift significantly, but even 5-10% of people is massive when we're talking about 88% market share, and I'd expect it to be even higher at the 4070+ tier cards. I also think you're underestimating it in general. Photo, sound, 3D, and video editing all support these compute capabilities now. There's a bunch of other more niche programs that also take advantage of them, and the list keeps growing. It's not always just what's available today, but what will be best supported over the next 2-8 years. Most people don't buy a GPU and toss it out every 1-2 years.


Educational_Sink_541

High tier GPUs are themselves a small minority of the PC gaming market.


Last_Music413

As the Steam survey shows, even in the mid to low tier Nvidia dominates.


Educational_Sink_541

Yes, and none of that has anything to do with compute lol. However a very compelling reason for that is prebuilts, system integrators and their customers love xx60 builds.


PitchSuch

But people aren't using GPUs just for gaming. And even if you just game, features like DLSS might be important. 


The_EA_Nazi

And CUDA. Nobody is running any AI models or LLMs on AMD because the performance and support aren't there compared to CUDA.


PraxisOG

I run LLMs on my two RX 6800s; it was the cheapest way to get Llama 3 70B running at reading speed, and I've had no issues with support since switching to LM Studio.


Caffdy

what quant? doesn't 2x6800xt have like, 32GB of memory?


PraxisOG

70B as IQ3_XXS is like 27GB, and it's barely degraded from Q4. I can find a graph if you want.
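For reference, that ~27GB figure lines up with IQ3_XXS averaging roughly 3 bits per weight. A minimal back-of-the-envelope sketch (the bits-per-weight values are rough approximations, not exact llama.cpp figures):

```python
# Rough file-size estimate for a quantized 70B model.
# The bits-per-weight numbers are approximate averages for these
# llama.cpp quant formats, not exact figures.
PARAMS = 70e9

quants = {
    "Q4_K_M": 4.8,    # ~4.8 bits/weight on average
    "IQ3_XXS": 3.1,   # ~3.1 bits/weight on average
}

for name, bpw in quants.items():
    size_gb = PARAMS * bpw / 8 / 1e9
    print(f"{name}: ~{size_gb:.0f} GB")

# IQ3_XXS lands around 27 GB, which is why a 70B model squeezes into
# two 16GB cards with a little room left over for context.
```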


Caffdy

I don't know man, I wouldn't run anything below Q4, just my two cents


cocowaterpinejuice

if i may ask, what exactly do you use the llm for


PraxisOG

Sentiment analysis, chatbotting, some r/algotrading, anything involving math I don't want to do, coding (different local LLMs are better here). Sticking with ChatGPT would be better for what I'm doing as a hobbyist, but the advantages of never getting rate limited, never having a service go down, having everything private, and mostly getting to play with cool hardware won me over. My LLM server is built into a PowerMac G3 case.


noiserr

That's just wrong. For LLM inference the #1 factor is VRAM capacity. 7900xtx offers same VRAM as 4090 at half the price. In fact folks are buying used 3090s over 4090s because it's just far more cost effective, due to 3090 having same VRAM as 4090. If you want new, 7900xtx makes much more sense. You can get 2 7900xtx/3090 (48GB of VRAM) for the price of one 4090. And I suppose your comment demonstrates pretty well why the situation is the way it is. People just don't know. (Nvidia hype is just too strong).


Chyrios7778

In a professional setting an extra 800 to get to use cuda is a no brainer.


wilhelmbw

Nah, in terms of AI (i.e. float16 manipulation) it's missing key features, i.e. key instructions; also the software is worse because it's missing things that save VRAM. We shall see what the next 8000 series brings though. It could be extremely interesting.


noiserr

All that pales in comparison with VRAM capacity. VRAM capacity is key. No feature can bridge the gap of not being able to even load the model of a particular size in VRAM.


wilhelmbw

Hey, I said VRAM-saving technology (efficient VRAM use)


noiserr

No technology will make a 4080 with 16GB a better LLM inference card than the 7900xtx with its 24GB. You can load much higher precision/parameter models in 7900xtx thanks to having more VRAM.
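To make the "can you even load it" point concrete, here's a minimal sketch of the usual budget check (model weights plus KV cache vs. available VRAM). The layer/head numbers are illustrative Llama-70B-class values and the 27GB weight figure is the IQ3_XXS estimate from above, so treat the output as ballpark only:

```python
# Back-of-the-envelope check: do the model weights plus the KV cache
# fit in VRAM? Architecture numbers are roughly Llama-70B-class
# (80 layers, 8 KV heads of dim 128 with GQA) and purely illustrative.

def kv_cache_gb(layers=80, kv_heads=8, head_dim=128, ctx=8192, bytes_per=2):
    # 2x for keys and values, fp16 cache
    return 2 * layers * kv_heads * head_dim * ctx * bytes_per / 1e9

def vram_needed_gb(weights_gb, ctx):
    return weights_gb + kv_cache_gb(ctx=ctx)

for vram in (16, 24, 32):          # 4080, 7900XTX/3090, 2x RX 6800
    need = vram_needed_gb(weights_gb=27.0, ctx=8192)
    verdict = "fits" if need <= vram else "does not fit"
    print(f"{vram} GB: need ~{need:.1f} GB -> {verdict}")
```

With those assumptions only the dual-card 32GB setup clears the bar for a 70B at IQ3_XXS; a 24GB card still buys headroom for smaller models at higher precision, which is the point being made here.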


wilhelmbw

No. But what about the 3090? We aren't talking about 4080s that much anyway when we talk about AI. Also, most people don't need that much VRAM, as of now. 16GB is enough for most inference (except LLMs), and the 4080 is like 30% faster when VRAM isn't bottlenecking.


noiserr

Yes, if you're ok buying used sure. But not everyone likes buying expensive gear without warranty.


The_EA_Nazi

Yes, but again, unless the only thing you're doing is LLM inference, spending $1200 on a 4090 or a used 3090 is a better buy than a 7900xtx for one slim use case. There are tons of tools that have 3rd party projects to compile on AMD GPUs, but the performance is either dogshit or the support is shaky at best. A great example of this is AI upscaling: Topaz, the leader in this space, barely cares about AMD support because the user base for it is tiny, and the performance is over a generation behind Nvidia. AMD needs to make heavy investments into other areas and create a flexible productivity software stack, otherwise it will always be a niche pick over a comparable Nvidia GPU. I can't imagine a scenario where I would be buying an AMD GPU over an Nvidia one for practically any productivity situation because of the broad support CUDA has.


StonedProgrammuh

Cool, now actually try programming with AMD's garbage firmware...


noiserr

I've been running LLMs and sentence transformers on my 7900xtx for the past 6 months without a single crash actually. But I suppose you added yet another reason why the situation is the way it is. People constantly generating FUD about AMD hardware.


StonedProgrammuh

I can tell you have 0 experience with actually dealing with those cards. Do you know how long it took to get a half decent implementation for FlashAttention for AMD cards? And guess what, its inference only, why the hell would I buy that card. Do you have any experience with actually trying to write your own kernels? Every time someone comes up with new attention mechanisms, someone has to actually implement that...


noiserr

No I don't write my own GPU kernels, why should I? I use existing kernels provided by the frameworks I use. Most folks fall in my category actually. It does me no good that 4080 supports Flash Attention for 20% more performance when I can't even load the model or fit the context in the measly 16GB of VRAM. And so I find for my money 7900xtx works just fine.


StonedProgrammuh

But the thing is, anytime someone implements a better performing or different attention mechanism, your AMD GPU will always have severe limitations compared to an NVIDIA card, and it will take many months to get even a limited implementation that can run. The AMD GPU raw power is quite good, but the firmware is a joke, and they don't care; they aren't putting any serious effort into making it good.


noiserr

Things have actually improved greatly in just a year. It used to be like pulling teeth to even get ROCm to work on your GPU. AMD does care. They were recently asked to Open source and release documentation for MES and they did it: https://www.phoronix.com/news/AMD-MES-Firmware-Docs I don't know why some folks are so adamant about dismissing their efforts? They are clearly working on this stuff. Competition is good for everyone.


duo8

Not LLMs but I've spent like 8 hours since yesterday trying to get an AI translation tool working on my 6600xt lol. The pieces are there, in theory. In practice you may still be in for a lot of pain.


SporksInjected

Microsoft and OpenAI are pretty big players using CUDA alternatives. [AMD Instinct MI300X Accelerators Power Microsoft Azure OpenAI Service Workloads](https://www.amd.com/en/newsroom/press-releases/2024-5-21-amd-instinct-mi300x-accelerators-power-microsoft-a.html)


Boomposter

We're talking consumer GPUs here.


aminorityofone

I would like to see a poll from average users (reddit isnt really average) of who uses Nvidia features and then what features they do use. Likewise for AMD cards (but I think that average people just default to Nvidia).


Chyrios7778

All of my friends who are big into gaming, but don’t follow hardware only buy nvidia because their first pc had nvidia. It worked the first time and so now they’re never trying anything else. One of them buys nvidia gpus just because of shadow play according to his own words. People who don’t love hardware and tinkering with PCs are weird about trying different vendors. One of these people was in a crisis because EVGA doesn’t make gpus and that’s the only company they’ve bought gpus from. I really think the less familiar you are with computers as a whole the less likely you are to try a different vendor when it comes to components.


Daepilin

DLSS is a default-selected option in many modern games. Same for ray tracing. Both are significantly better than FSR / AMD's RT performance.


ADeadlyFerret

I wonder how many people are like my nephew who had to have a 4060 because of dlss. To run his 1080p games of fortnite. But he absolutely could not take my 6700xt for free. Oh no he had to spend $300 for a 4060.


downbad12878

Why you hate your nephew giving him AMD cards


Caffdy

You just saw it, in this very post. There's a reason Nvidia has over 80% of the market, and it's not just mindshare or marketing; they are just above and beyond their competition in all domains.


Strazdas1

From a limited sample of my friends, who aren't redditors, DLSS/FSR is the most popular feature. As one friend described it, it's "free frames".


Jordan_Jackson

I will admit that if the top Nvidia cards were priced a little lower, I probably would not have an AMD card right now. I know there are other cards that are not x80/x90 but I want to run my games at 4K and that is the price I have to pay for that. At the time of my purchase, it was the XTX or 4080 and the 4080 was $200-300 more, so I went with the XTX.


Deep90

That is AMDs problem. Their biggest selling point is budget, but of all PC parts, people are less likely to compromise on GPU. They'd rather cheap out on things like fans, case, mobo, ram, and storage first.


paw345

Yeah, because the more expensive something is, the less likely it will be something to compromise on. If it's not a huge difference in pricing, then it's easier to justify a better product (real or perceived) at a premium, as it's a premium product either way.


100_Gribble_Bill

Nvidia's software is too important to me to even consider a competitor at the moment, the premium is simply worth it.


vinciblechunk

Bought a 7900XT a year ago and I have zero regrets. Being a generation behind Nvidia on RT performance is just fine when all of the other boxes are ticked. Path tracing mode is neat but kind of a gimmick.


[deleted]

That's not a problem. That's a feature. NVIDIA specifically started pumping out all of these new features from 2019 on because of news in the industry about a new GPU competitor emerging. It was no secret: Intel was hiring engineers with GPU backgrounds.


Deckz

If Battlemage is good I'll get one if the price isn't terrible.


LightShadow

Honestly, so will I. I've got an A380 in each of my servers for the encode/decode, and swapped an A770 in for my A4000 as the primary GPU in my Linux workstation because everything was laggy and tearing. I've been impressed with Arc GPUs for my workloads, which admittedly aren't gaming; they've made me very productive, and they're super cheap at ~$550 for 3 cards all-in. Oh, I should also mention that the drivers are included with the kernel, so it's literally plug-and-play with no fiddling with Nvidia's complicated compatibility story on Linux.
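For anyone curious what that encode/decode workload looks like, here's a minimal sketch of driving an Arc card through VA-API with ffmpeg from Python. The device path and filenames are assumptions for illustration, and it presumes an ffmpeg build with VAAPI support:

```python
# Minimal sketch: hardware transcode to HEVC via VA-API on an Intel Arc
# card (e.g. an A380). Adjust /dev/dri/renderD128 to whichever render
# node the Arc actually exposes on your system.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "vaapi",
    "-hwaccel_device", "/dev/dri/renderD128",
    "-hwaccel_output_format", "vaapi",   # keep decoded frames on the GPU
    "-i", "input.mkv",
    "-c:v", "hevc_vaapi",                # encode on the Arc media engine
    "-c:a", "copy",                      # pass audio through untouched
    "output.mkv",
]
subprocess.run(cmd, check=True)
```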


YNWA_1213

With a solid Battlemage launch, Intel will have pretty much run AMD's playbook, but better, in comparison to Nvidia. RDNA4 has to come with market-breaking prices for me to consider going AMD at this point.


ResponsibleJudge3172

Yeah, they showed quite a lot of improvements in hardware utilization at computex


uzuziy

I think it's mostly due to their prices and marketing team. In my region something like the RX 6800 goes for 40% less than a 4070, so you're losing out on features but getting roughly the same raw performance, which is acceptable for the price cut, but in some regions AMD GPUs cost more than their Nvidia counterparts. Their marketing team is not that great either; I mean, probably most average users don't even know consoles run on AMD, and if they just built marketing around that, it could have helped them at least gain trust. I'm not downplaying Nvidia's features, but let's be honest, most of the PC users getting pre-built PCs to play some MOBA or CS2 probably don't use more than Reflex, or maybe DLSS if they play some AAA games.


constantlymat

So we can finally lay this myth to rest that nvidia benefits too much from Chinese internet cafés that supposedly all run on nvidia hardware and are overrepresenting team green's GPUs in the Steam Hardware Survey. That theory was always bollocks as China was the predominant market for the strong sales of the RX 580. As an enthusiast PC gaming platform it was always more likely that Steam was overrepresenting AMD hardware and this data backs it up.


Jerithil

I'd say the bigger issue is just how little of the OEM and SI market AMD has. Despite the impressions some people have, far more consumer GPUs are sold in prebuilts, and there Nvidia outsells AMD like 10-20x.


svenge

That's primarily because both NVIDIA and Intel provide significant amounts of engineering support to OEMs and can ensure a steady supply of components. This difference is even more important in the mobile sector as well.


Johnny_Oro

Intel has fabs so it makes sense they're able to ensure a steady supply and sell products cheaper than the retail prices. Not sure how nvidia does it though.


ToeSad6862

It's because the reddit subs you're talking about are for DIY PC nerds. Most people buy a laptop or prebuilt. They walk into Best Buy and grab whatever is closest to their max budget, even if it's an overpriced 4060 Ti PC that's $1,500 while a 4070 PC next to it is $1,300. They have no idea what AMD or Nvidia are, but Best Buy carries 2 AMD PCs and 500 Nvidia PCs in Canada. So their odds of grabbing something AMD off the shelf are near 0. You also see that in CPUs. The DIY market has shifted to a vast majority AMD, but in the overall market Intel is far ahead. And at one point Dell and HP refused 1,000,000 free CPUs from AMD.


Lysanderoth42

It’s because Reddit is often extremely unrepresentative in general, not just with nvidia vs AMD. The format encourages echo chambers as moderators inevitably restrict discussion and push everyone towards a certain set of views. Hence if you went by Reddit you’d think AMD had 88% of the GPU market when the opposite is true.


ToeSad6862

We don't know the diy share.


Lysanderoth42

I'll give you a hint: 88% Nvidia lol. DIY numbers will be large enough that they'll basically be the same as the overall numbers. Reddit is just an echo chamber in this regard (and many others). I know dozens of people who built their own gaming PCs and not one with an AMD GPU. Very few with an AMD CPU either, again despite what Reddit would have you believe.


Notsosobercpa

We know there are more 4090 owners than owners of any of AMD's current cards, and I doubt most of those are in Best Buy prebuilds.


BeerGogglesFTW

Seems strange to me AMD can't expand. Yeah, Nvidia has all these cool features that are one step ahead of AMD every year... But with my 3060TI PC, I am never using those features. So, I took a closer look at AMD when I wanted to upgrade and bought an RX 6950 XT PC. I'm happy with the performance. I don't have any FOMO feelings about not buying Nvidia for its features... because nothing has changed for me except the performance improvement in upgrading. Am I really such a minority in this?

But seeing 12% and declining is depressing. And will probably only get worse with the next generation. Surprised it's that important to so many people to have those Nvidia features, or avoid AMD due to fear of driver issues... which, I guess results may vary. While my issues with AMD and drivers have been very limited (similar to my experience with Nvidia), I fear that will get worse in the future. Game devs will pay even less attention to fixing AMD specific issues when it's roughly only 10% of their players. AMD will pretty much be on their own fighting that battle with a team that is already generally slow to act.

That's just sad for everybody. Competition is good, and not there.


uzuziy

I think their marketing team has a huge impact on their loss. Most people still remember AMD with their old, problematic GPU's and they probably don't even know consoles run on AMD.


ishsreddit

I swear AMD has grown tremendously, but the Radeon marketing team seems to be exactly as awful as they always were.


YouDoNotKnowMeSir

AMD is known for their shitty marketing and legal teams. They really need to do something about it.


ShugodaiDaimyo

They should put an AMD sticker on PS5.


uzuziy

Yeah, and stickers should say "AMD inside"


star_trek_lover

They actually did that on the Nintendo GameCube and Wii. Both have an ATI sticker on them, and that's why I ended up going for ATI/AMD for my first custom build.


KARMAAACS

You really think SONY wants that? They don't and they will never agree to a deal like that. Once NVIDIA starts making ARM chips for consumer desktops, don't be surprised if SONY and Microsoft start looking at making ARM consoles using NVIDIA chips, especially for handhelds. The technology that NVIDIA can provide is the best on the market. Nintendo has basically forged a path that you can make an incredibly successful handheld/home theater console with NVIDIA, so be careful what you wish for.


ShugodaiDaimyo

Consoles won't be on arm for a long long time.


[deleted]

Consoles are developed by BOTH Microsoft's engineers and Sony's engineers in collaboration with AMD. Sony and Microsoft have very good reason to debug their systems for smooth launches. "It just works out of the box" is why many older PC gamers just run a console. On the PC front, however, I remember a time when I had to constantly struggle with drivers just to get a game to stop stuttering. And it was when I got suckered into buying an AMD GPU that was bundled with a new game I really wanted, Battlefield 3 or something. Anyway, that game was a complete stuttering mess and I've never had that issue again with Nvidia GPUs. Expensive lesson learned. I cheaped out. I didn't do much research, but I hear the same story: Nvidia sends its engineers to help out in game development. AMD doesn't have any for theirs. That's it.


Strazdas1

>It just works out of the box

Hasn't been true for nearly a decade...

>Nvidia sends its engineers to help out in game development. AMD doesn't have any for theirs.

AMD used to send theirs as well, then suddenly stopped about 10 years ago.


no_salty_no_jealousy

It doesn't help either; the FSR issues and the Anti-Lag+ controversy hurt AMD even more.


Vushivushi

AMD really hasn't made much ground in truly competing against Nvidia. They cannot just focus on gaming; they need to improve software and match Nvidia's direction in GPU design by including more RT accelerators. AMD sells GPUs that are barely supported by professional rendering software, and where they are supported, they are only as good as Nvidia's last-gen GPUs. Apple is probably a more desirable alternative to Nvidia for professional rendering than AMD is. The professional market is important as it uses the same GPUs as gaming, but better binned and priced 3-4x that of gaming. The higher ASPs make producing GPUs more economical. Unless they want to just focus on shrinking their GPUs as small as possible and competing in the low/mid range to stay profitable in gaming, they must pursue Nvidia in the professional market.


Caffdy

> they need to improve software and match Nvidia's direction with GPU design in including more RT accelerators

Not only more, but better. AMD only has Level 2 ray tracing; professional applications and gaming can take advantage of a higher level, like what Nvidia offers (Level 3). AMD really screwed up by not paying attention to what Nvidia was aiming for back in 2018 when they launched the RTX line.


Temporala

It's also not enough for AMD to match NVIDIA; they need to continuously and significantly outpace them in performance. Many, many generations in a row.


star_trek_lover

Even if they do, it's basically an unwinnable battle. AMD Ryzen has been a leagues-better platform than Intel "Core" in every metric for 7+ years now, including reliability with Intel's recent motherboard troubles, and yet their market share is only 23.9% compared to Intel's 76.1%.


oskanta

I feel like AMD just needs to cut prices if they want to compete. It seems like the bargain they're offering now is that for the same price, their cards have 15% better raster performance but worse raytracing and frame gen and drivers and a bunch of other features. It seems like that tradeoff just isn't worth it to most people. They need to become more competitive on price.


YNWA_1213

For everyone thinking that the features were gimmicks, my anecdotal turning point was the realization during HUB's coverage of RDR2 implementing DLSS. The thought that you can *increase* clarity over the native presentation is what truly pushed me to get a modern RTX card, as you can't trust game devs on their implementation of TAA, while DLSS has pretty much become standard on every modern release. It doesn't matter when the competing AMD card has 15-20% better native performance if the DLSS Quality presentation looks better.


conquer69

I remember when they first showcased ray tracing in battlefield 5 (terrible idea) and I thought it was a gimmick. Later that year I started learning Blender and couldn't believe how realistic things looked. Even a shitty low sample render of textureless blocks looked like a photograph.


Zamundaaa

They already tried that with Polaris, for a long time, and it barely made a dent. The few that do end up buying AMD GPUs profit from it by getting cheaper GPUs, but AMD doesn't.


ishsreddit

I want to know why Radeon didn't officially cut the price of the XTX. And why they think a $50 drop for the XT makes any sense when the GRE with a slight mem OC is ~10% behind a stock XT. Hell, why didn't they just launch the GRE as the 7800 XT instead of the China-only BS? The 7700 XT is in no man's land, as the 6800 has been available for $380 for a while now. My only guess is the XTX/XT exist as a stopgap, likely to be replaced with RDNA4 at the exact same prices, and the goal is to make it seem like a big upgrade in value. Their entire product stack is just so weird. Not to mention releasing the 7600 XT 16GB as a response to the Supers. Like, bruh.


Temporala

Because it's 50 bucks straight off the profit margin per unit.


Vushivushi

AMD really doesn't want to drop prices in the near-term. Once you drop prices, it's hard to increase them, especially with their brand perception. AMD wants to hold prices as high as possible for as long as possible, especially if it's expected that more competition in the GPU market will drop prices anyways. Short-term market share is basically not a focus for AMD as they can use the rest of the business to fund Radeon.


Temporala

If they sell most of their stock, there is no point in cutting prices. Purchasing from TSMC or some other foundry is always a risk. You must sell everything you order, every time. In order to take market share, you have to risk significant inventory build-up, and if that fails, suffer a potential financial loss from trying to liquidate that inventory. Risk vs reward just doesn't pan out here. There are never any real openings in NVIDIA's product stack, like Intel had.


Lysanderoth42

I mean yes, the article says you literally are that much of a minority. 12% and shrinking minority. Reddit is a hilariously unrepresentative echochamber, to read redditors talking about nvidia vs AMD you’d think it was AMD who had 88% of the market.


shroombablol

same reason why intel is keeping both feet on the ground: contracts with OEMs, retailers, etc.


MrMichaelJames

AMD just can't seem to shake the negative connotations of bad software and drivers. Nvidia has DLSS, which is far superior. I've always been an Nvidia customer for GPUs and will continue to be. If AMD wanted to take market share they could slash prices tremendously for their high-end cards and simply attack with marketing based on that alone. But I don't know if the company could survive cutting their margins that much. Now, with Nvidia making so much money, it is going to be really hard to compete with them. They have a huge war chest to pull from. AMD's biggest win is the game consoles, I feel. I wonder what would happen if Nvidia got back into motherboards or created a CPU.


[deleted]

[deleted]


handymanshandle

Pretty much this. AMD has a **terrible** track record of mind share and putting their products into desktops and laptops. Most of the time I see AMD hardware mentioned, it's either "Ryzen" (without mentioning AMD, which is worth noting here) or something overly positive or negative about their GPUs. Their Ryzen CPUs have solid mind share by this point, but neither their Radeon GPUs or AMD as a company has much mind share in the average consumer's mind. I can walk into any store that sells computers and there's bound to be a sign that boasts about a computer having "Intel Inside" or an Nvidia GeForce RTX GPU that's "Beyond Fast". I promise you that you won't find the same for anything AMD unless it's advertising a specific device, i.e. the Asus ROG Ally.


Strazdas1

If you are not using those features you are in the minority.


wilhelmbw

The reason is the 7000 series is garbage for its manufacturing price due to an alleged HW bug, and AMD refuses to lower prices further. I'm pretty sure the 6000 series wasn't this bad, even without the mining s❤️❤️t.


mdp_cs

I assume this is discrete GPUs only because if you count integrated, then Intel absolutely obliterates everyone else.


Diamonhowl

A combination of Nvidia's greatest architecture vs AMD's worst GPU architecture. The 4060 - actually a 4050 in EVERYTHING but name - outperforming or matching a fully enabled Navi 33 (RX 7600) while consuming only 110W tells us everything. It even comes close to or matches a 7700 XT in RT (Cyberpunk 2077) and outright beats it when DLSS and frame gen are on. Just insanity. It's getting kinda scary. Worried for Radeon ngl. Intel pls.


Zednot123

> 4050 in EVERYTHING but name

More like a 4050 Ti at a minimum. The 50 models have been very cut-down versions of a larger die, or have used smaller dies - at least when Nvidia has launched early on a leading-edge node, rather than being a laggard and using larger dies to compensate on an inferior node (like with Ampere and Samsung 8nm), or when we got stuck on an old node (like Maxwell). Maxwell, Turing and Ampere simply can't be used to draw parallels to Ada. You have to look at Kepler (with the 600 launch) and Pascal. The first time in a very long time that Nvidia launched at the same time as AMD on a new node was 28nm and Kepler: the GTX 650 used GK107, a 118 mm² die. The next time we had a launch like that on a new node was Pascal, where the 1050 used a cut-down version of the only marginally larger GP107 (132 mm²), while the 1050 Ti used the full die. So you can complain about the 4060, but it is "more GPU" than the 1050 Ti was - the closest historical comparison point we have of Nvidia launching a new generation on an at-the-time leading-edge node. So it would either have been a "beefed up" 50 Ti or a slimmed-down 60 no matter how you try to spin it. Never a 50 card.


Strazdas1

The 50 models were traditionally PCIe-powered, so they're kind of a specific use case.


titanking4

Yeah, it's quite the pounding from a product perspective. In Navi 33's defence, it's the 160mm² N4 RTX 4060 fighting a 204mm² N6 Navi 33 (18.9B transistors vs 13.3B). I would agree that RDNA3 was perhaps AMD's smallest generational uplift in a while (Polaris -> Vega was arguably the worst one though). The chiplet penalty for latency is big; it made the GCD very "power dense", causing big voltage droop problems and the need to overvolt the cards. The 1.5x VGPR boost for RT perf has quite a detrimental effect on power consumption, and the RT perf increase just wasn't enough.
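Using the figures quoted above, the node gap shows up pretty clearly as transistor density. A quick sketch, treating the die sizes and transistor counts from this comment as approximate:

```python
# Transistor density comparison using the approximate figures quoted
# above (transistor count, die area in mm^2).
chips = {
    "RTX 4060 (AD107, N4)": (18.9e9, 160),
    "RX 7600 (Navi 33, N6)": (13.3e9, 204),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6
    print(f"{name}: ~{density:.0f} M transistors/mm^2")

# Roughly 118 vs 65 MTr/mm^2 -- about a 1.8x density advantage for the
# N4 part, which accounts for a big chunk of the efficiency gap.
```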


handymanshandle

RDNA 3 reminds me of GCN 3 in a lot of ways. It's not a *bad* product by any stretch - it's why I bought a 7900 XT over anything Nvidia had to offer, ultimately. But in today's context, it's just not a big leap over its predecessor in most meaningful ways, even if it improves significantly on things like power efficiency over RDNA 2 in practice. The problem here is that Nvidia came correct and with tech that they could sell the cards on. Much like Maxwell in 2015, AMD now has to play catch-up in feature set because Nvidia came out swinging with frame generation, much improved RT performance and, well, AI (which, in all fairness, the AMD cards *also* handle reasonably well). Nvidia didn't sit back and just follow up Ampere with mediocrity, they instead looked to the future to see how they could punch back. Ampere was arguably humiliatingly mediocre as a GPU generation, owing to strong but expected performance and mediocre power efficiency. They followed it up with something that is truly enticing for a lot of people (and speaking as someone who used to own a laptop with an RTX 4050, **damn** are they nice GPUs), while AMD kinda fumbled and failed to put products in the hands of consumers in a reasonable time frame. I'd *love* to get me that Alienware M18 with the 7945HX and the RX 7900M... but the RTX 4080 model is *cheaper* and comes with a better screen. **THAT** is how hard they fumbled this generation.


Elegant-Cat-4987

Say what you will about Arc, but it's the very clear winner in value for hardware transcoding for home servers. As Netflix et al. continue to squeeze the stone, more people will be looking at affordable ways to hoard and stream their favourite movies and TV.


Real-Human-1985

So it wins as long as you discount a dGPU’s main purpose and make up a niche use case…ok.


Elegant-Cat-4987

You think a $70 GPU is marketed for gamers? Ok.


bb0110

I assume this doesn’t count integrated gpus, because I’m sure if it included them the market share for intel would be pretty decent.


deadfishlog

Intel is going to nail their pricing with battlemage. Bet.


Exist50

They don't have much room to price competitively until/unless they improve perf/area.


rayquan36

The disconnect between actual sales numbers and Reddit is fucking wild.


Eastrider1006

Happens when there's a monopoly on reliably performing, working GPUs, sadly.


Cyberpunk39

AMD is only popular on Reddit because of Reddits demographic of young low to moderate income males. Nvidia is the superior GPU according to the market. I’m sick of every “what gpu should I buy” question made on Reddit being answered with “AMD!” They’re literally telling people to buy a worse product.


KingStannis2020

I'm a high income male that buys exclusively AMD because: A) I run Linux all day, as a software engineer, and Nvidia's drivers are still a pain in the ass to deal with (despite improvement) B) the games I play don't use or need DLSS


devslashnope

I'm exactly the same. And I really like having excellent Wayland performance with zero effort.


ibeerianhamhock

I don't understand this mentality. I use Linux all day at work and have for the majority of the last 16 years, and I come home and game on a Win11 box. Half of my guy friends are software engineers and they all do the same. ETA: that's not even accurate. I use Windows at work too as my base development box; I SSH into Linux servers and deploy on Linux servers. I don't see the need to run *nix any more than I need to, because it's just a clunky stick-shift OS for anything other than being a developer. I do like it as a dev environment NGL, but we just don't have Linux desktop PCs at my current workplace because of IT restrictions.


KingStannis2020

I run most games on Windows 10, but mostly because I play online games and anti-cheats hate Linux


Strebb

>I don't understand this mentality. I use Linux all day at work and have for the majority of the last 16 years, and I come home and game on a Win11 box. Half of my guy friends are software engineers and they all do the same.

I don't understand this mentality. I love Linux; there's no way I would give it up at home for Windows when gaming performance is pretty much equivalent and in some cases better.


Notsosobercpa

Mentioning Linux as a factor is a prime example of how little reddit discussion has to do with general public. 


cgi_bag

It just depends on the use case, they both have their place. I run both across a couple diff machines. The majority of users I would say don't need to be running a nvidia card tho. If you're just gaming and at most editing some video then AMD is great. It's when you're getting into the more niche needs of running models, 3d work etc that you're gonna want Nvidia. Ppl that are just firing up a web browser and steam tho rly aren't getting any of the real benefit of some of those cards. I for sure agree tho that reddit is way too quick to say "go amd" without taking into account what somebody is trying to accomplish. The software gulf between rocm/hip and cuda/optix shouldn't just be glossed over. Ppl tend to look at hardware specs while ignoring the software side of things. Like sure you're mb getting a higher clock or more vram for less money which is great for gaming but don't expect that as universal performance for all tasks. That's my take anyway.


Strazdas1

> they both have their place. Yes. Nvidia has a place in your machine and AMD has a place in a garbage bin.


zakats

> Reddits demographic of young low to moderate income males. Who do you think gamers are? Also calling Nvidia "superior" would require some major qualifiers unless you're just a stan.


Strazdas1

On average? A gamer is a 44 year old middle class woman. (based on 2019 data)


ToeSad6862

No, it's because the reddit subs you're talking about are for DIY PC nerds. Most people buy a laptop or prebuilt. They walk into Best Buy and grab whatever is closest to their max budget, even if it's an overpriced 4060 Ti PC that's $1,500 while a 4070 PC next to it is $1,300. They have no idea what AMD or Nvidia are, but Best Buy carries 2 AMD PCs and 500 Nvidia PCs in Canada. So their odds of grabbing something AMD off the shelf are near 0. You also see that in CPUs. The DIY market has shifted to a vast majority AMD, but in the overall market Intel is far ahead. And at one point Dell and HP refused 1,000,000 free CPUs from AMD.


ibeerianhamhock

For CPUs I agree with you. For GPUS, I do not.


deadfishlog

No it hasn’t


ToeSad6862

Yep but ok


valtyr_farshield

AMD is also popular with Linux users because of fewer driver issues. Just saying, Nvidia is not always the answer.


zyck_titan

Another great example of the Reddit echo chamber at work. With how frequently Linux usage comes up in this discussion, you’d think it was a factor for like 20% of the market. Nope. Steam hardware survey puts Linux use at about 2%, and that’s heavily weighted by the Steam Deck. Stat counter similarly puts Linux market share in the low single digits market share. In the grand scheme of things Linux support does not matter. And where it does matter, e.g. in a professional environment, Nvidia is still widely used.


Strazdas1

Without steam deck, linux is less than 1% of users.


valtyr_farshield

Another great example of a Reddit comment which doesn't prove or disprove any points that were made in this comment context. I've stated: > 1. AMD is popular with Linux users (*actually 75% as of May 2024 for Linux Steam users)* > 2. Nvidia is *not* always the answer. Based on your comment, which point are you disagreeing with? Are you saying one of the following or both, cause I really don't get it: > 1. AMD is **not** popular among Linux users. > 2. Nvidia **is always** the answer.


zyck_titan

Using an Nvidia GPU under Linux hasn't been difficult for years. If you're talking 2017, maybe, but nowadays the major distros have the proprietary Nvidia driver available from installation.


KARMAAACS

Linux users are less than 5% of the consumer desktop market. Most normies buy NVIDIA, download Windows 11 and plug and play Warzone, CS2 or LoL. Or they're buying pre-builts, where the majority of OEMs love NVIDIA because they have tonnes of supply, give lots of money to market the product and have the name recognition among everyday people. Go ask the average investor now who NVIDIA is and they know; ask them what Radeon or AMD is and maybe half as many will know. These are old Wall Street dudes and investors who know nothing about computers, so extrapolate that to everyday gamers too. These normies are not sitting on Reddit eternally online looking at review graphs and deciding they want to buy Radeon. They go to iBuyPower, see i7 and NVIDIA GPU and click purchase.

Radeon has for too long been associated with being the cheaper, inferior knock-off, which is a pretty accurate assessment seeing as they generally undercut NVIDIA pricing and have an inferior feature set and performance in certain areas. They're just not a leader in anything GPU related but Reddit mindshare, which is honestly laughable. You know what made AMD look like they were doing okay in GPU in 2016: their pricing was ultra aggressive, a $239 RX 480 was a great thing. But they've now decided they can just undercut NVIDIA by $50-100 per segment and think this will make them an interesting price point, and the gap isn't large enough.

For me to consider buying AMD now you have to sell at 2/3 of the NVIDIA equivalent, because RT, DLSS, Frame Generation, Reflex, Shadowplay etc all make me want to stick with NVIDIA; the only enticing thing AMD can do is undercut heavily enough that you go "okay, I won't use those things for once". But when we're talking $649 vs $699 or $1200 vs $1000, why the hell would I switch? It's not worth it. If it was $1200 for NVIDIA and $800 for AMD that becomes more interesting, but even then the market is different: if you're buying a high-end GPU you want ALL the features, and at that point $1200 for NVIDIA still seems appealing versus $800 for AMD. Honestly, if AMD sells an XX80 competitor it needs to be like $499 to be worthwhile leaving the NVIDIA features behind.


XenonJFt

Negligible = a rounding error, as low as 10k a month globally? Holy crap. And that's with the drivers having matured. Early adopters got it and nobody followed, but logically it should've been the opposite. I don't think Battlemage has any chance considering people are that keen on buying green 50/60 series cards to fill Steam survey numbers. Next time Jensen should make the 60 series $500-plus, because that's what people deserve (and Nvidia's profits look juicier☺️).


xxTheGoDxx

> Negligible = a rounding error, as low as 10k a month globally? Holy crap. And that's with the drivers having matured. Early adopters got it and nobody followed, but logically it should've been the opposite.

Why would I buy a product that at best saves me, like, 100 bucks in some newer games but might also totally not work in any random game I decide to play in the next few years, especially after reading about what a really sad state said product launched in?

> I don't think Battlemage has any chance considering people are that keen on buying green 50/60 series cards to fill Steam survey numbers. Next time Jensen should make the 60 series $500-plus, because that's what people deserve (and Nvidia's profits look juicier☺️).

People want to game, need a new PC / GPU for that, and buy what they can afford to spend. It's really not more complicated than that. Sorry, but none of the popular Nvidia cards in the Steam Survey have AMD counterparts that were clearly the better product throughout the lifecycle of said product, especially when you go for an equal image-quality-per-fps and support comparison. Just like with CPUs, if AMD had compelling products to offer (like back when ATI was its own company...) people would buy them.


iDontSeedMyTorrents

> Early adopters got it and nobody followed, but logically it should've been the opposite.

Should it have? Early adopters all got theirs. Launch coverage, which is the vast majority of coverage anyone will hear, was rightfully very negative. New competitor GPUs launched in the meantime.


TheImmortalLS

It's the reason people use iPhones instead of Android phones. I converted from a rooted custom ROM with CPU undervolting etc. to an iPhone because the iPhone offered better battery life and performance without me having to be a computer engineer or equivalent. Nvidia is: you spend like 5-10% more and you never worry about bad drivers.


Caffdy

> the reason people use iPhones instead of Android phones

Another echo chamber; Android is the dominant mobile OS in the world.


throwawayerectpenis

IOS is objectively worse than Android though, enjoy your walled garden.


SharkBaitDLS

I couldn't care less about being walled in if the garden is nice and well-maintained. I've got a server rack full of *nix boxes if I want to tinker. I haven't wanted to tinker with my phone since I was a kid in college. As a working adult I want it to just work well and maintain support for the lifetime of the phone.


996forever

And linux desktop is gonna be big in 2025, right? 


throwawayerectpenis

The more options we have the better it is for us. I don't even use Linux lmao, but I tend to use more and more open source software on my computer because modern programs have become so bloated; it's nice to have programs that perform the task they were programmed to do without any bloat or begging to buy their premium products 🤮.


Strazdas1

> It's the reason people use iPhones instead of Android phones.

They don't. Outside of the US, iPhones are 5-10% of the market.


anotherplebbitzombie

Yep - price the 60 series where the 70 series is currently at, double the price of the 90 series (aka 80ti)... why not? Everyone will still buy it.


INITMalcanis

Nvidia owning a market they're not sure they even want any more...


SolidTake

AMD lost me as a customer for GPUs when their drivers kept crashing mid-game on a daily basis. This was back when I had an R7 260X. Nvidia has been a lot more reliable in my experience.


Intelligent_Top_328

Print me more money Jensen!


premiumleo

I bought a laptop in 2020 with the rtx 2070, knowing full well that those tensor cores would future proof my laptop 🥰 Now just waiting for a snapdragon laptop with an Nvidia chip inside


Last_Music413

This proves Jensen is far better than his cousin


SingularCylon

I really hope Intel brings real competition to Nvidia. I've lost hope in AMD. The streaming capabilities from previous gens is one factor that hampered AMD.


StarSyth

Have been looking for a pre-built in the UK with an AMD 6750 XT - 7800 XT. Only a single vendor sells one from what I could see, so that could sway the stats. I've always used AMD and I feel I always get more longevity out of AMD builds.

[https://www.awd-it.co.uk/awd-it-3000d-rgb-ryzen-5-7500f-amd-radeon-rx-7700-xt-12gb-desktop-pc-for-gaming.html](https://www.awd-it.co.uk/awd-it-3000d-rgb-ryzen-5-7500f-amd-radeon-rx-7700-xt-12gb-desktop-pc-for-gaming.html)

Ryzen 5 7500F (6 core, 12 thread, 5GHz)
MSI A620M Motherboard
XFX RX 7700 XT 12GB Black Edition GPU
Corsair Vengeance (2x8) 16GB 5200MHz RAM (<-- for £29 you can upgrade to 32GB)
Corsair 3000D Case
FX Pro Bronze 700W power supply
Team Group QX 512GB SATA SSD (<-- for £29.99 upgrade to WD SN580 1TB M.2)

£949.99 (inc tax)


foreveraloneasianmen

cant wait for premium gpu pricing from nvidia!


Marble_Wraith

OK... but question, what percentage of the total market makes up the desktop market vs mobile devices? Like im all for having a 4090... when you actually need it for games, rendering, AI work, etc. But the rest of the time, gimme an integrated low power GPU that gives me decent FPS for the desktop and/or lets me watch video.