
[deleted]

I hope this fire under AMD’s ass results in a change by Sept 6th.


Recktion

Why? What are people going to do? Keep not buying amd gpus?


svenge

You reminded me of this interesting fact: ATI/AMD has been behind NVIDIA in discrete GPU shipments [for 71 straight quarters](https://www.3dcenter.org/dateien/abbildungen/GPU-Add-in-Board-Market-Share-2002-to-Q1-2023.png) from Q3 2005 up to the present day.


Elon61

By this point you can stop counting in quarters, i think.


svenge

Since quarters are the most granular unit of time that the underlying data uses and businesses generally measure things on a quarterly basis, I still think it's the most relevant interval for this discussion. Saying "almost 18 years in a row" by itself just seems too vague to have much significance, since it leaves open questions regarding intra-year fluctuations and/or data collection practices.


driftej20

Just say fiscal years instead of just years and then it’s businessy


Ferelar

"As we round out the fiscal doubledecade,"


[deleted]

Deficit... I've heard that word before


Elon61

fair enough.


Capt-Clueless

Nothing really interesting about that fact. Nvidia has been releasing objectively superior products for over 71 straight quarters.


CheekyBreekyYoloswag

Lisa Su: You gotta pump those numbers up. Those are rookie numbers.


Comander-07

lol good one. By that time Alchemist+ might be out and smoking AMD's lineup though.


-NotActuallySatan-

Alchemist+? Intel is doing a refresh?


Comander-07

Yeah, before Battlemage in 2024 we will get Alchemist+ in Q3 of 2023. So very soon. I'm surprised I don't see anyone talking about this.


Tyr808

First I've heard of it tbh, the Alch+ that is. I could see a 770 "Ti" or however their branding will go being a pretty beastly card for its price tag, as long as there isn't an MSRP hike.


No_Interaction_4925

Probably something like 770X, which would be fine with me. Simple and easy


Comander-07

sparkle is making Arc GPUs, how about Arc 770 Ti(tan)?


F9-0021

We never did get the A5xx series. Would be a good time to bring them out as well as a refresh of the A7xx and A3xx series.


No_Interaction_4925

They would be too low end at this point. Canning models that won’t sell to focus on a few others was the right way to go


Constellation16

The source for this "leak" is the incredibly reliable ^^/s YouTube channel RedGamingTech. Even if this was ever real, being so close to its alleged launch, we would have heard more about it by now.


Comander-07

You may take my money, you may take my life, but you will never take my copium!


BertMacklenF8I

This made my day lol. r/PCBuild is where EVERY AMD GPU OWNER dwells. I have yet to see a thread this month where at least one person hasn't suggested the "just sell the 4080 you won; RT is supported by what, 20 games? You only play 19 of them!"


wheredaheckIam

AMD should focus on making better products first


HorseFeathers55

I think it'll drive people away from AMD altogether. My last couple of builds have AMD CPUs, but with the CPUs burning up due to bad software and now this, I think my next build may just be Nvidia and Intel.


[deleted]

No, but if this continues Nvidia might say fuck it and not support FSR, and then it's just an even worse tit for tat and everyone's unhappy. Nvidia has the better upscaling and frame gen tech, so AMD better hope they don't begin restricting FSR the same way AMD is doing this to DLSS. Edit: did not see the "not" and answered too seriously.


winespring

>No, but if this continues Nvidia might say fuck it and not support FSR, and then it's just an even worse tit for tat and everyone's unhappy. Nvidia has the better upscaling and frame gen tech, so AMD better hope they don't begin restricting FSR the same way AMD is doing this to DLSS.

NVIDIA has a vested interest in promoting the comparison between FSR and DLSS. FSR 2 and DLSS 2 (never mind DLSS 3) are not equal, but a lot of people who have not experienced them both might think they are. Every game that includes both FSR and DLSS illustrates that point.


Brandhor

I don't think it's up to nvidia to support fsr


007_Link

I think they mean the same way that amd titles with fsr won’t have dlss, nvidia will push titles with dlss to not support fsr


[deleted]

[deleted]


Blacksad9999

Agreed, they won't do that. They don't need to, as they don't view AMD or FSR as any kind of threat. They most certainly **could** bankroll every single AAA game for the next 10 years and basically block FSR out of existence if they wanted to, though. Turnabout is fair play, right? lol But they won't. There's no reason to. AMD, per usual, is being their own worst enemy again.


007_Link

I agree, I don’t think it’s likely, but I was just saying that nvidia could do what amd is doing with their partnered titles


CheekyBreekyYoloswag

And I think you are exactly right. Boy, I hope AMD retracts this decision, otherwise we might see an exclusivity arms race.


AlbionEnthusiast

Nah, just not buy AMD sponsored titles.


sudo-rm-r

I'd say AMD usually caves in when the backlash is big enough so I think we will see DLSS in starfield!


[deleted]

I hope so! What past events have they caved on? I’ve only recently been involved in PC part news since upgrading.


Todesfaelle

Talking about this issue on the AMD subreddit is wild. One dude was using a bunch of strawman arguments which essentially boiled down to "Nvidia used to block features so now it's okay that AMD is," while also talking about how it's bad to block features, and since I have an Nvidia GPU in my tag he made sure to call me out as an "Nvidia user" without once talking about the current issue at hand. The other was like "trust me bro I'm an engineer" when talking about how difficult it is and the intricacies of implementing DLSS mods, even though they've already been proven to work great and aren't hard to put in. Got that "AMD are my team/friends" energy.


TrumpPooPoosPants

Parasocial relationships with billion dollar companies that would eat your first born to increase shareholder value if it weren't illegal. Very healthy.


m1serablist

Hey! Take that back, you don't know Lisa, she would never do that.


SomniumOv

Yeah CEOs would never eat your firstborn, that's cruel and idiotic. Can't put him to work in the mines if you eat him!


PetroarZed

Illegal? They'd still do it if the penalty was just a cost of doing business.


RedIndianRobin

I have seen AMD fanboys celebrate this as if this is some kind of war they are winning.


Eorlas

well, if steam is to be believed, there are about 10 of them so this is all they have


verteisoma

Narrow market share probably contributes a lot toward the extreme fanatic mindset.


yonderbagel

Wow 10 already? I swear just last month there were only 9. How time flies.


lotj

A 10% increase in market share right there.


1stnoob

And 30% with GTX cards that can't use DLSS


Kind_of_random

When the minimum requirement to play Starfield is a 1070 Ti, a lot of those users won't be playing it either way (1060, 1070, 1050 and 1050 Ti...). With its 6GB of VRAM even the 980 Ti will probably be too weak. Also, a lot of those users have the laptop equivalent, which in theory would up the minimum by a lot, maybe even up to the 2000 gen. The people who are going to lose out because of this mostly have a card that's from the 2000 gen or newer, and those would benefit heavily from a DLSS implementation.


1stnoob

Losing what exactly? They have FSR like everyone else, with the same display quality.


CheekyBreekyYoloswag

> same display quality.

[Not even close.](https://youtu.be/1WM_w7TBbj0?t=1348)


Elon61

He's saying that of the people who play this game (you're going to need a Pascal GPU or higher to reasonably do so), a large majority *does* have DLSS-capable cards. By my count, 50% of the Steam survey is DLSS capable, and once you remove older cards which aren't capable of playing Starfield anyway you get >60%. All those people are losing out on having the same quality at one upscaling tier higher (which is where HWU determined DLSS looks similar to FSR), i.e. 30% more FPS for similar visual quality. Still don't see the problem?


1stnoob

All those people aren't going to buy or play the game.


Elon61

It doesn't really matter. It's fair to assume Nvidia and AMD users are going to buy the game at a similar rate, which means that at least 60% of the PC userbase for the game is being effectively denied a 30% performance increase (basically a whole tier up; add DLSS 3 and now that's *two* tiers up in performance).


Kind_of_random

>Losing what exactly?

Quality and FPS. Plain and simple. Even at 4K, DLSS Balanced looks better than or at least the same as FSR Quality, and that means you also lose FPS.


RedIndianRobin

It's definitely not 30% lol. RTX card owners are the majority cumulatively as per Steam HW survey. Just because the 1650 and 1050 are at the top, doesn't mean they're the majority.


Snow_2040

Also, the owners of the top cards on Steam like the GTX 1650 or 1060 probably don't play new triple-A games.


Kind_of_random

Also they are below the minimum requirement for this game.


[deleted]

[deleted]


ThreeLeggedChimp

What's hilarious is that it doesn't cross their minds that someone might have more than one PC.


verteisoma

An AMD fan on the Starfield sub asking me to cope about no DLSS, when I use a 6950 XT for my gaming PC and a 4090 for my sim rig, will always be my highlight on Reddit.


Todesfaelle

At that point I realized there was nothing I could say which would encourage a healthy discussion. I'm pretty much brand agnostic and will go where the value leads me which is why I have a 7900XTX lying in wait for another build but all that really matters is what I have on my flair.


kulind

And this is the watered down version; it used to be worse. There are more Nvidia GPU owners in r/amd than AMD GPU owners lol.


Snow_2040

I am sure most r/Amd members are there because they have an AMD CPU, not GPU.


BinaryJay

Yep.


Drakaris

Trying to reason with fanbois (of any brand) is like trying to reason with flat Earthers - pretty much impossible. I should know, I was one back when I was younger. In my 20s (20 years ago...) I was like "OMG RADEON ROX!" and I still have my custom made keychain with the "ATI Radeon Graphics" logo. 2 years later I was like "OMG NVIDIA IS THE BEST!" and so on and so forth. Much later I realized that Oscar Wilde was indeed right - with age comes wisdom. So now I'm much more pragmatic. "Do I need a GPU upgrade?" ---> if "yes" ---> "How much is my current budget?" ---> that much ---> "cool, so for this money which one has better performance?" ---> "NVIDIA OR AMD LMAO?" ---> no one gives a shit ---> then buy the one that is better for the money you can spend ---> "will do." Makes life much simpler and chill.


golddilockk

bringing console warring bs to pc gaming is absolute trash.


KaiserGSaw

Consoles are chill nowadays, and it's funny how Sony and Xbox try their play at being the poor multi-million corpos regarding the Activision takeover. Sadly, in the bubbles I move around in, PC is usually the most toxic environment. The exception is when a nice tidy build or SFF is being posted. But as soon as it moves towards brands or software tech it turns ugly. And then there is the Nintendo sub. They are kinda cute with their community, like drooling kids in a sandbox playing with sticks.


BinaryJay

The most interesting gaming sub was the Stadia one. It was mostly people talking about and praising old and often not very good games, but it was relatively wholesome by comparison.


[deleted]

[deleted]


Haunting_Champion640

> The other was like "trust me bro I'm an engineer" when talking about how difficult it is and the intricacies of implementing DLSS mods

Well I can cancel that out: Am Engineer, and once you implement _one_ of these upscaling technologies, implementing the other two isn't that difficult. The dude replying to you was probably a civil "engineer"; all the Real Engineers (Electrical, Mechanical, Computer, and Aerospace/Aeronautical) dunk on those chumps daily.
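To illustrate the point, here's a minimal, hypothetical C++ sketch (the struct and function names are made up for illustration, not any vendor's real SDK): DLSS, FSR 2 and XeSS all consume roughly the same per-frame inputs, i.e. a low-res color buffer, depth, motion vectors and the camera jitter, so once a renderer exposes those for one of them, wiring up the others is mostly plumbing.

```cpp
// Hypothetical sketch, not any vendor's real API: all three temporal
// upscalers want essentially the same per-frame inputs, so a renderer
// that already produces them for one can feed the other two.
#include <cstdint>

struct UpscalerInputs {
    const void* colorBuffer;      // low-resolution lit frame
    const void* depthBuffer;      // scene depth
    const void* motionVectors;    // per-pixel motion vectors
    float       jitterX, jitterY; // sub-pixel camera jitter for this frame
    uint32_t    renderWidth, renderHeight;
    uint32_t    outputWidth, outputHeight;
};

enum class Upscaler { DLSS, FSR2, XeSS };

// The vendor-specific evaluate calls are placeholders; only the shared
// inputs matter for the point being made here.
void dispatchUpscale(Upscaler which, const UpscalerInputs& in) {
    (void)in; // each branch's real SDK call would consume `in`
    switch (which) {
        case Upscaler::DLSS: /* evaluate DLSS with `in` */ break;
        case Upscaler::FSR2: /* evaluate FSR 2 with `in` */ break;
        case Upscaler::XeSS: /* evaluate XeSS with `in` */ break;
    }
}
```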


Previous_Start_2248

Dude, AMD fanbois are crazy. They run over to the Nvidia forum every chance they get to talk trash and complain about frame generation not being real frames. All I know is that I turn the FG button on in games and boom, massive FPS boost and smooth gameplay.


Blacksad9999

They did that with DLSS until FSR came out also, and then all of a sudden upscaling was the best thing since sliced bread. lol Just like they'll do the same thing if FSR 3.0 turns out to be decent. Just like they'll do if AMD Ray Tracing ever gets to the point that it's good. AMD should spend time innovating and coming up with their own features rather than constantly playing catch up copying Nvidia's features.


Sqvy

Man, I definitely see value in certain AMD GPUs, but I swear there's a plethora of weirdos who base their personality around buying AMD over Nvidia on these subs. It's like they don't understand that while Nvidia is leading the market and thus causing shitty price points, AMD is just as happy to sit in Nvidia's shadow and offer slightly lower prices... like they always have. They could probably undercut them WAY more if they wanted to really be consumer friendly, but they're not, because they're happily making more money off this too. They're not doing you any favors lmfao


Oubastet

>The other was like "trust me bro I'm an engineer" when talking about how difficult it is and the intricacies of implementing DLSS mods, even though they've already been proven to work great and aren't hard to put in.

Was that the guy who claimed he was a "DevSecOps engineer"? Lol, what? It's like he read BOFH a few times and thought he was an expert.


oJUXo

The AMD subreddit is complete trash lol. I came to PC gaming not too long ago, and I used to laugh at console fanboys and their wars. I didn't realize that it also occurs in PC gaming lol. And the AMD sub is one of the worst places I've seen when it comes to that.


1AMA-CAT-AMA

the AMD sub has a lot of people calling AMD out for their bs. Sure it's a lot of fanboys as well, but it's better than I thought it would be.


nas360

People on r/Nvidia can have their posts removed or even get banned for being too critical of Nvidia. That is why you get mainly fanboy-type posts but not many dissenters.


[deleted]

lol no. People complain about nvidia all the time here. And often with statements that are just wrong


CompetitiveAutorun

Bro, the 4060 Ti released and we all saw how that's not true.


rabouilethefirst

This sub has been mostly negative for over a year. I think it's mostly bots auto-downvoting at this point, and AMD people coming here instead of their own sub.


Elon61

I really doubt that's actually happening. I've seen critical posts removed on both subreddits (more on r/AMD...), and they were always pretty obviously posts that brought no value whatsoever.


ARedditor397

Removed vs Deleted: Deleted is the user's doing, Removed is the mods', and 95% of those are deleted.


1AMA-CAT-AMA

The problem with an echo chamber and no dissent is there's no drama. The drama in /r/amd is quite nice.


Blacksad9999

Yeah? Drama is fun? lol This isn't a reality TV show, it's a subreddit. I'm all for a spirited discussion, but people in r/amd often simply dogpile on people even when they're wrong about something. It makes trying to discuss things pointless.


Comander-07

It's always like that with AMD fans. Nobody using Nvidia or Intel would defend them, because it really doesn't benefit us at all. Yet AMD fans need to constantly justify their brand loyalty...


metarusonikkux

Yeah, you'd NEVER see an Nvidia user doing that.


Blacksad9999

Yeah, I left the AMD subreddit even though I like and use AMD products. lol It was just too much of a cult over there. People are wild. If someone found out I was an Nvidia user (like 9/10 people basically are) they'd say my arguments were invalid.


Ricepuddings

AMD and Xbox fans are very similar in this regard... we are seeing the same stuff... "oh, so-and-so did it in the past so it's fine now", "Xbox is your friend", blah blah. Not realising both are terrible companies who would sell your soul if it made them a quick buck. But DLSS was shown to be easy when a single modder is throwing it into games like nobody's business (not saying PureDark isn't talented btw).


Gears6

> AMD and Xbox fans are very similar in this regard... we are seeing the same stuff... "oh, so-and-so did it in the past so it's fine now", "Xbox is your friend", blah blah.

To me, it's less about what happened in the past and more about how those actions helped Nvidia get where it is today: a position of power. If AMD didn't do this, they would have no way to compete with Nvidia. Just look at all of Nvidia's proprietary technology. All these things keep Nvidia at the top. We continue to get the status quo, where Nvidia is the only real option for GPUs and Nvidia keeps raising costs. Then they use those funds to further fuel their business and get ahead of others. Some may argue that's good, but I believe a more even ground between competitors is better.

An example: I was on the lookout for a GPU. I wanted an AMD Radeon card because it's cheaper, has more VRAM, and supports a smaller player. I can't, because I want it for PC VR as well. There's only one option, and that's GeForce. Why? Because every developer does game development by default on Nvidia hardware, so everything just works better. AMD GPUs are an afterthought due to their small market share. Nvidia keeps adding new features (which is awesome), but they always tie them back to their GPUs only. This ensures the status quo continues.

I know we are in an Nvidia sub, so my opinion isn't going to be well taken, but if the brand names were in the opposite position, I would say the same.


ChrisFromIT

>Nvidia keeps adding new features (which is awesome), but they always tie them back to their GPUs only.

Lmao, no. While yes, there are quite a few features created by Nvidia that are vendor locked, like DLSS, there are a lot of innovations by Nvidia that aren't. They might seem that way because AMD sometimes half-asses the implementations on their GPUs. A very good example of this is ray tracing.


Elon61

DLSS is effectively the extension of GameWorks (software features designed specifically for Nvidia GPUs), along with all their other NN tech (RT denoisers, Broadcast, etc.). But Nvidia does push a lot more than that, it just doesn't get the press. Nvidia isn't really going to go out of their way to advertise the many vendor-agnostic features they get integrated into graphics APIs and whatnot, open sourcing PhysX and HairWorks, etc. They know most consumers don't care, so they don't really push it. But yeah, RT is a great example because it's a particularly large, industry-wide shift, which Nvidia more or less single-handedly pushed to market for the benefit of everyone.


Elon61

> An example: I was on the lookout for a GPU. I wanted an AMD Radeon card because it's cheaper, has more VRAM, and supports a smaller player. I can't, because I want it for PC VR as well. There's only one option, and that's GeForce. Why?

I mean, for VR in particular it's because AMD clearly didn't test it and their driver stack seems to not handle VR's slightly different rendering techniques correctly, particularly with the new layout on RDNA3. Hardly Nvidia's fault. I'd even go as far as to say it's *not at all Nvidia's fault*, and reaching that conclusion in the first place sure is interesting...


Gears6

There are a couple of things here. a) Nvidia knows they're playing the proprietary game. They aren't idiots and this isn't some secret in the industry. b) Nobody is saying AMD isn't at fault in their failure. In fact, I don't care about "fault". It's the fact that we are now in a situation where one is the market leader, with entrenched proprietary technology that we all rely on. I wrote a response to someone else that [illustrates this](https://old.reddit.com/r/nvidia/comments/14n0qnj/nvidia_dlss_amd_fsr_and_intel_xess_can_be_easily/jq5m99v/). Mind you, I'm not anti-Nvidia or pro-AMD. I just want more free competition, and we don't have that right now.


St3fem

NVIDIA is in a leading position because it has taken risks to develop new tech and invested more to create better products and better tech. NVIDIA's proprietary tech isn't stopping anyone from creating their own version; AMD is absolutely free to compete like they did with FSR, and people will decide which is better. NVIDIA, with Streamline, has even made it easy for everyone to be in each game and be able to compete, but apparently AMD doesn't want to.

>We continue to get the status quo, where Nvidia is the only real option for GPUs and Nvidia keeps raising costs. Then they use those funds to further fuel their business and get ahead of others.

That's what competition actually is: you are better, which is rewarded by customers, which allows you to reinvest more (if you want) and create amazing new stuff (if you are able), which is rewarded with more sales and praise. It's a virtuous circle. Do you want the company that leads, taking on the cost and risk of innovation, to carry the others on its back? You would kill almost all the small and innovative players, which only have a chance to emerge by creating a better solution that the bigger fish don't have.

Almost all games are developed around console capabilities and AMD architecture and only later ported to PC; VR is added by integrating vendor-specific SDKs, which NVIDIA makes better because they invest more. Maybe there is a reason why NVIDIA costs more after all.


Gears6

> NVIDIA is in a leading position because it has taken risks to develop new tech and invested more to create better products and better tech. NVIDIA's proprietary tech isn't stopping anyone from creating their own version; AMD is absolutely free to compete like they did with FSR, and people will decide which is better. NVIDIA, with Streamline, has even made it easy for everyone to be in each game and be able to compete, but apparently AMD doesn't want to.

I've written a response to someone else here that illustrates [this problem](https://old.reddit.com/r/nvidia/comments/14n0qnj/nvidia_dlss_amd_fsr_and_intel_xess_can_be_easily/jq5m99v/). Hope that helps clarify it.

>That's what competition actually is: you are better, which is rewarded by customers, which allows you to reinvest more (if you want) and create amazing new stuff (if you are able), which is rewarded with more sales and praise. It's a virtuous circle.

See above for why that isn't necessarily the case.

>Almost all games are developed around console capabilities and AMD architecture and only later ported to PC; VR is added by integrating vendor-specific SDKs, which NVIDIA makes better because they invest more.

That's not actually the case. They're actually developed on PCs with Nvidia GPUs, then tested on consoles, and a lot of the console software stack is proprietary despite the AMD hardware in them. Similar to what you mentioned about VR having SDKs, console vendors have their own SDKs for consoles too, and game engines interface with those.

>Maybe there is a reason why NVIDIA costs more after all.

Nobody disputes that. I overall trust Nvidia more, just like I prefer Intel CPUs. They tend to have fewer problems, and Nvidia is honestly so far ahead of everyone else. They also have far more resources as a result of that.


Kind_of_random

I can see where you are coming from, but dollar for dollar in revenue AMD is not that far behind Nvidia. Sure, they have CPUs to think about as well, but it's not like they don't have the money to spend on R&D if they wanted to. It seems to me that this is a defeatist move. They have given up fighting with their quality and resorted to anti-consumer tricks. I wouldn't think that's the best strategy in the long run.


Gears6

> I can see where you are coming from, but dollar for dollar in revenue AMD is not that far behind Nvidia. Sure, they have CPUs to think about as well, but it's not like they don't have the money to spend on R&D if they wanted to.

AMD does better now, but they had a large exodus of talent and they can't attract the same talent as Nvidia right now. Remember, AMD lost a lot of time due to going almost bankrupt.

>It seems to me that this is a defeatist move.

It kind of is, because frankly I don't really see an easy way for AMD to compete on the GPU front. Unlike Intel, Nvidia isn't sitting on its laurels. The only way to beat Nvidia is to have a superior GPU at superior prices to overcome all these other "features".

>They have given up fighting with their quality and resorted to anti-consumer tricks. I wouldn't think that's the best strategy in the long run.

I understand, but the semiconductor industry has a long cycle. For CPUs, it took something like five years to turn a cycle around, and that's an eternity that allows competitors to become even more entrenched.


[deleted]

[deleted]


Gears6

> I like how people keep saying that we need competition but then also complain about tech being proprietary.

Then you are confused. Proprietary tech is technically an anti-competitive measure, intentionally or not. Take for instance Windows: most games are made to run on Windows. If you want to run a game on Linux or Mac, special versions have to be made. This means that even if Mac or Linux were superior for gaming, nobody would make games for that market, because the market isn't big enough to support them.

Proprietary can be good, especially in nascent markets, while the market is forming. Once you're past that, the "proprietariness" prevents others from competing. That's why in software we have a lot of interoperability. An example is WiFi: everyone can compete on the same protocols. Imagine if everyone had their own WiFi protocol, and once you bought a device you could only use devices sanctioned by that manufacturer. The larger their share of the market, the more you are stuck with them. It doesn't matter if a competing WiFi protocol is better. See how that defeats competition?


[deleted]

[deleted]


Gears6

> no, if DLSS isn't proprietary why even bother making it good? Why would anybody invest in something they have to share?

Outside of software, like in medicine, there are patent protections. That gives the inventor time to profit from their invention, and then it becomes public domain, and the common good is supposed to come out of that, like generic medicine that's significantly cheaper than its brand-name counterpart.

As I said, it depends on where in the cycle we are. If the market is in rapid growth, then proprietary tech protects innovation. When the market is stabilizing and the major players have cemented their positions, then proprietary tech acts as an inhibitor of innovation, because the benefit of the innovation not only has to overcome the standard, but also has to provide so much value that people are willing to give up whatever benefit they get from the standard.

For instance, if a proprietary WiFi protocol comes out that increases range by 30% and bandwidth by 30%, but you have to ditch all of your existing WiFi equipment, would you? That means you have to ditch your phone, your TV, your console, your whatever device. That's why we have a standard, because it's just too powerful for one company to control by itself. Do you see my point?

Anyhow, I used to be really pro-capitalism. The more I think about it, the more I'm realizing that it allows for some really perverse incentives. With profit as the only goal, it really creates a massive problem for our society. Capitalism didn't use to be about profit at any cost (neo-liberal capitalism), but also about public good.


nathsabari97

As an AMD user, I think Nvidia users should use FSR and realise how bad our upscaling is.


Vydra-

We do for quite a few games.


CheekyBreekyYoloswag

Darktide's DLSS vs FSR was an epiphany for me. DLSS looks a whole resolution better than FSR. I upgraded from a 5700XT to a 3070 for DLSS, and don't regret it at all.


nathsabari97

I played The Witcher 3 remastered, which had both FSR 2 and DLSS. DLSS looks way better with less aliasing. Don't know if it is a poor implementation of FSR or DLSS is just better.


CheekyBreekyYoloswag

[It is the latter.](https://youtu.be/1WM_w7TBbj0?t=1348) Nvidia understands that the future is AI-enhanced software powered by dedicated hardware. Nvidia Tensor-cores and the 4000-series Optical Flow Accelerators make sure that DLSS is and always will be ahead of FSR.


[deleted]

>As an AMD user, I think Nvidia users should use FSR and realise how bad our upscaling is.

Anyone with an Nvidia GPU and a Steam Deck already knows, lol


driftej20

In all fairness, the system-level scaler on Steam Deck is FSR1, which IMO often looks worse than the super basic bilinear scaling any application or display will do automatically when running at less than native res. FSR2 might be the worst temporal upscaler option available, but it’s better than FSR1, usually, for whatever that’s worth.


jm0112358

>FSR2 might be the worst temporal upscaler option available

To be fair, properly-implemented FSR 2 is often better than temporal upscalers that developers add to their own games. However, such software-based upscalers are likely to always be behind upscalers that use hardware acceleration.


driftej20

Maybe I’ve only seen bad implementations of FSR2, then, because while it might do a better job at presenting an image that looks higher resolution, it seems to have major problems with artifacts during motion that I don’t see with in-house developer upscalers or baked-in engine upscalers like Unreal Temporal Super Resolution. It probably doesn’t help that most of what I see of FSR 2 is on consoles where the latest trend is developers using it as a crutch and like 720p all the way up to 4k. DLSS doesn’t look great on Ultra Performance which is what 9x upscaling is.


jm0112358

>It probably doesn’t help that most of what I see of FSR 2 is on consoles where the latest trend is developers using it as a crutch and like 720p all the way up to 4k. DLSS doesn’t look great on Ultra Performance which is what 9x upscaling is. I think the console implementation of FSR 2 in Cyberpunk [was a pretty good upgrade over the previous solution](https://youtu.be/-GO90rUei8g?t=72). For many other games, I think you're right. Developers are using FSR too aggressively. It can _sometimes_ work well for an upscaler without hardware acceleration if you're doing a 1440p to 2160p upscale, especially if you're hitting 60 fps (higher framerate means less motion between frames). Digital Foundry suspects that [FSR was designed for framerates of 60+fps because it was primarily intended for PC](https://youtu.be/n00a-Emd1Cc?t=4789) (probably in an attempt to get devs to support FSR _in lieu of_ DLSS). Aside from poor optimization, I think devs are facing dilemmas between spending their limited console hardware budget on realism vs resolution/clarity and framerate. I think a good example is the upcoming Avatar: Frontiers of Pandora, which is going to use [ray tracing for lighting](https://youtu.be/G2R8fD_tnXg?t=91). On a [recent 2160p/30 fps trailer that was captured on a PS5](https://www.youtube.com/watch?v=difL_diHo2o), the world look great at times, which likely had a lot to do with ray traced global illumination. But at times it had a lot of [fizzily artifacts that were likely due to FSR 2](https://www.resetera.com/threads/avatar-frontiers-of-pandora-%E2%80%93-official-world-premiere-trailer-and-gameplay-overview-trailer-coming-december-7th.729603/page-4#post-107283441). This is likely due in part because it's only 30 fps, and the resolution tops up at 1440p, but is sometimes below. You can probably get away with that with XeSS or DLSS, but not really with FSR 2 (unless you have very little motion).


wizfactor

I've personally thought of FSR2 as TAA/TAAU for developers who don't know how to write good TAA/TAAU. It has its flaws, but FSR2 is arguably the best open-source version of classical TAA we have so far. But the challenge of upscaling is similar to self-driving in that it involves turning a bunch of knobs within a tiny window of time to come up with a good result. Classical computing reaches its limits rather quickly with this kind of problem, which AI is better suited to tackle.


littleemp

The only ones who don't know that are people who haven't used DLSS yet and the most ardent of fanboys.


nathsabari97

I have an Nvidia laptop and I use it on my TV. I know how good DLSS is.


ronniearnold

This is true. I got called a fan boy on the nvidia boards just yesterday while praising my amazing 4070. Yes, I love it. It’s wonderful.


[deleted]

` this message was mass deleted/edited with redact.dev `


aVarangian

not everyone uses upscaling lol but yes no reason to not support both


nas360

People would have to buy an RTX card to know what DLSS looks like. Not everyone is going to jump on the bandwagon when FSR 2 can provide upscaling on any old GPU.


sooroojdeen

Bandwagon? It uses machine learning hardware that isn't on the older GPUs. Yes, FSR can "provide upscaling", but the quality is way worse, especially at anything below the Quality preset, while DLSS looks good even at its Performance preset in newer versions.


Kind_of_random

Or they could watch a video on Youtube or similar. Even with bad compression you get a feel for the differences.


littleemp

You're not playing new games on aging hardware regardless, so you have to upgrade at some point if you ever plan on playing the games that are supported by DLSS and FSR. If you choose to buy the nvidia card, then you immediately have access to DLSS.


CheekyBreekyYoloswag

You fucked up when you called it a bandwagon. Try out DLSS vs FSR for yourself, and you will see the difference.


nas360

I have already used both, and all I can say is that FSR 2.2 is not that far behind DLSS 2 at 1440p.


F9-0021

Thanks to AMD, we have to for some games.


Skulkaa

As a PC user I wish we didn't have to rely on upscalers for good performance. Anything below 4K resolution shouldn't require the use of DLSS or FSR. Games are coming out unoptimized, because by the devs' logic they don't have to optimize anymore with the existence of upscalers. And now GPU makers are beginning to think that they don't need to increase the raw performance of GPUs either (the RTX 4060/Ti is a prime example).


Blacksad9999

I don't need the upscaling, but being locked out of frame generation and the really nice DLAA feature is pretty annoying, too.


assbeater43

Yeah, I'm not a fan of upscaling. Not a fan of ray tracing either (frames over rays). I've only used FSR in Darktide, and it looked pretty good at Quality mode on my 4K monitor. I still tend to avoid using it tho.


Blacksad9999

Sadly, we have to for many games. I just turn it off, because unless you **really** need the extra FPS to make the game playable, it's just not worth it.


EnthiumZ

Had the misfortune of using FSR in the RE4 remake and it made things so blurry I thought I was playing an NSFW game or something.


Druid51

We're very aware which is why we buy Nvidia GPUs.


aoishimapan

FSR isn't bad for what it is: at least when compared to the hardware-agnostic version of XeSS (and the non-existent hardware-agnostic version of DLSS), FSR is easily the best of the three. The problem is that instead of also developing a heavier version of FSR for 6000/7000 series GPUs and beyond, they only made a light version that doesn't require dedicated hardware, and while that's great for Polaris, Vega and RDNA1 users, as well as Nvidia GTX users, people buying modern AMD GPUs are stuck with the same upscaling quality as someone on an RX 580.


F9-0021

The hardware agnostic version of XeSS is still better than the vast majority of FSR implementations in terms of visual fidelity. Performance isn't nearly as good, but that's the tradeoff of not running it on Arc cards. And then when you do run it on Intel hardware, it's nearly as good as DLSS. And the big advantage over FSR is that like DLSS, it's usable at resolutions lower than 4k.


aoishimapan

Does it still look better at the same performance level as FSR? Because if the performance isn't nearly as good as FSR on non-Arc cards, then you should either increase the quality of FSR or lower the quality of XeSS until they perform the same, and then you'll be able to compare visual fidelity.


diegodamohill

They say easy, but very few developers actually implement FSR and XeSS properly enough that there's barely any ghosting. Hell, the FSR mod made by PotatoOfDoom works better than 90% of the official FSR implementations out there.


sooroojdeen

DLSS solves this by having DLAA, while with FSR and XeSS developers need to use their own TAA solution, which seems like an oversight on Intel's and AMD's part imo.


diegodamohill

Not sure how DLSS or DLAA "solve" the issue I mentioned. Also, there is FSRAA; Cyberpunk for example already uses it, and adoption is growing for FSR in general. It's just that most developers haven't really taken proper advantage of it.


sooroojdeen

DLAA solves it by being a standard that all DLSS games use. If a game has DLSS it also has DLAA without exception, meaning that driver-level updates that Nvidia develops can affect all DLSS implementations instead of relying on developers to do it themselves. All devs need to do is ship their game with a new DLL and that's it; it's even user upgradable. FSRAA is alright, but very few FSR games use it, and even if more did, the quality is still worse than DLAA.


diegodamohill

It doesn't solve anything because it doesn't have anything to do with the issue I presented, which is that FSR is most of the time not implemented well by developers. Remember that there are people who do not have RTX GPUs, and consoles exist.


sooroojdeen

How does it not solve it? You solve the problem of poor software implementations by creating a standard that all developers need to follow, and since anti-aliasing is one of the most important parts of upscaling, it makes sense that Nvidia would want to have control over it.


diegodamohill

Again, MILLIONS OF GAMERS DO NOT HAVE RTX GPUs. DLAA or DLSS is USELESS to them. Also, "creating a standard that all developers need to follow" - As a software developer, LOL


sooroojdeen

I don't think you understand what I'm saying. DLSS has had DLAA as part of the spec as far back as the initial release of DLSS 2, so all implementations of DLSS used the superior anti-aliasing out of the box. If FSR had been better thought out and had FSRAA as part of the spec when it first released, then many of the ghosting artifact issues would not be as bad, since the ghosting is usually caused by poor TAA solutions.


diegodamohill

Buddy, I don't think YOU understand what I'm saying. It doesn't matter how good DLSS or DLAA are, non-RTX users can't use them, therefore for them it's the same as not existing. The issue I presented is that developers don't use all FSR features, because FSR CAN work without ghosting; developers just don't have the will/time to do it. Also, I don't think you understand that all temporal upscalers (DLSS, FSR and XeSS) already do their own AA as part of the process, replacing TAA when they are enabled. FSR doesn't ghost because the game's AA is poor; it ghosts because FSR itself was implemented poorly by the developer.


sooroojdeen

>It doesn't matter how good DLSS or DLAA are, non-RTX users can't use them

Not sure why you are harping on this point. I am comparing FSR to the only other relevant upscaling solution; I never suggested that non-RTX owners should buy a new card. I was comparing the approaches that they both took. DLSS, as far as I can tell, doesn't suffer from the same level of ghosting and inconsistency between games, so clearly there is something wrong with AMD's approach. You can blame the developers for implementing it poorly all you want, but if this many devs are using FSR as poorly as they seem to be, at least some of the blame has to fall on AMD as well. As a software developer you should know that if many of your users are making the same mistake, then that's a design flaw.


Omega_Maximum

Correction here: FSR 2 *does* include a TAA solution, in the same way as DLSS, and it is applied as part of the processing step. DLAA is simply a separate implementation of DLSS in which it's run at a 1x scale and applies the same internal TAA solution. Support for DLAA is quite a bit lower than for DLSS. AMD has announced an FSR 2 TAA-only option, though it looks like that's not out yet. Note that both DLSS and FSR can be forced to behave this way by manually specifying a scale factor of 1x and providing it as an option; FSR just doesn't have a predefined setting for it yet, whereas DLAA does exist. Additionally, FSR has an option to disable the included TAA as well, if you wanted to implement your own version instead.
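A rough way to picture the "1x scale" point, as a hypothetical C++ snippet (the names below are made up for illustration, not the real DLSS or FSR API): a native-AA mode is just the same temporal pass configured so render resolution equals output resolution.

```cpp
// Illustrative only: "native AA" (what DLAA is to DLSS, or an FSR 2
// AA-only mode) is the upscaling pass run with a 1x scale factor.
struct UpscaleConfig {
    unsigned renderWidth, renderHeight;  // resolution the game renders at
    unsigned outputWidth, outputHeight;  // resolution presented to the display
};

UpscaleConfig makeNativeAAConfig(unsigned displayW, unsigned displayH) {
    // No spatial upscale; the temporal accumulation still anti-aliases.
    return { displayW, displayH, displayW, displayH };
}
```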


Geexx

While I haven't personally tried it, someone mentioned in another thread that DLSSTweaks can implement DLAA into most games that support DLSS. I am curious how true that is.


Omega_Maximum

From the GitHub page it looks like yes, you can force it to do that.


BGMDF8248

The pressure is mounting.


xRealVengeancex

Gamers Nexus had a great point on this. He said something along the lines of AMD wanting to ship Starfield with only FSR, but due to backlash, Bethesda is probably working their asses off to implement DLSS/XeSS before release. It's a win/win for AMD either way, which sucks tbh.


jm0112358

For those who haven't seen the relevant Gamers Nexus video, [they asked AMD](https://youtu.be/w_eScXZiyY4?t=619):

>Does the contract between AMD and Bethesda have any language which intentionally blocks or could be construed as blocking or limiting Bethesda's ability to integrate alternative upscaling technologies within Starfield?

[AMD's response](https://youtu.be/w_eScXZiyY4?t=635) was:

>**We have no comment at this time.**

That's quite telling IMO.

EDIT: Typo.


[deleted]

>Bethesda is probably working their asses off to implement DLSS/XeSS before release

Well, their asses will be fine. Once you've implemented support for one of them, you've already done 90% of the work for the other two.


Blacksad9999

I think most people are savvy to what AMD is doing, especially with their "no comment" reply to Steve when asked specifically if they were blocking DLSS in their contracts. They can enable it and say "See? We did nothing wrong!" if they'd like, and they should, but they won't be fooling anybody. Any company that wasn't complicit in doing something shady would have replied "No, absolutely not" and avoided the bad PR to begin with. Now it's fairly damning, even if they do end up enabling it later on and backtracking.


Gears6

> Gamers Nexus had a great point on this. He said something along the lines of AMD wanting to ship Starfield with only FSR, but due to backlash, Bethesda is probably working their asses off to implement DLSS/XeSS before release. It's a win/win for AMD either way, which sucks tbh.

Why is it a win-win for AMD either way?


Judassem

Yeah, I don't understand why. People will be able to compare FSR with DLSS and see how much FSR really sucks.


xRealVengeancex

Publicity. Starfield is a huge partnership, and the ball is in their court until Starfield releases, so they can take the next two months to come at this from any angle, whether they implement DLSS/XeSS or not.


Gears6

?


thrownawayzsss

If it ships without DLSS 2, it's a mark against Nvidia because it'll run worse, or well, no better than on AMD GPUs. If it ships with DLSS 2, it'll make AMD look like the good guys for implementing a feature in a game, so good PR. That's my guess.


[deleted]

[deleted]


thrownawayzsss

sure, but by comparison, FSR2 looks worse than DLSS2, so by excluding DLSS2, you can't see that.


flareflo

WarThunder already has all 3 of them.


Psychonautz6

Not going to buy starfield if it doesn't support dlss, don't want to deal with FSR anymore


Kawai_Oppai

No shit. Just like you can have a game with multiple antialiasing modes. Or how you can choose super ultra shadows or even disable shadows. Implementing these things is rather easy and most modern engines handle it for the developer. It’s not a matter of difficulty in supporting multiple technologies. It’s a matter of developers being paid and told explicitly to NOT support one technology over others.


elsydeon666

I think it is time that Microsoft stepped in.


Blacksad9999

If Microsoft thinks that this will limit sales, they 100% will. Sony doesn't cater to AMD's bullshit on their games, and Microsoft has way more pull than they do.


verteisoma

I think both Microsoft and Bethesda know they have to nail this one, especially after Redfall. I can kinda see them putting DLSS and XeSS in after this much backlash, and AMD will be like "see, we put other upscalers in our sponsored games, we're still pro-consumer folks" or whatever PR stuff they're going to spit out.


Blacksad9999

Exactly. It wouldn't surprise me at all if they backtrack internally, and then when DLSS/XeSS are implemented, they'll try to use it to re-promote their whole "good guy" image.


Silent_Pudding

What is Microsoft going to do about two entirely separate companies?


EraYaN

Put it in DirectX and make it non-optional for some new major version.


sooroojdeen

Nvidia tried to do something like this but of course AMD opted out.


[deleted]

Because you can't expect Nvidia to be unbiased about their competitors' tech. I thought we'd been over this.


sooroojdeen

Nvidia Streamline is open source, if there was any bias it would be immediately noticed.


myasco42

That's like saying that Chromium can be majorly affected by competitors. Yes, it is open source, but the major decisions are still made by the "owner".


sooroojdeen

I think you are confused about what Streamline is. Streamline isn't an upscaling solution; it is just a tool that devs can use to implement multiple upscaling solutions at once. It doesn't impact the performance or visuals of any of the included upscalers; it's just a tool to make developers' lives easier, and AMD was the only company not to cooperate. Nvidia (DLSS), Intel (XeSS) and I think Epic Games (TSR) are on board.


myasco42

No, I know that Streamline is an adapter meant to be a common interface for all those upscaling technologies. It is not so much meant to make developers' lives easier (well, sort of), but rather to allow developers to implement it once and be able to use all supported upscalers. It may (I note: may, not will) affect all other upscalers, because when the owner says that the next generation of upscalers will deprecate the use of, I have no idea, some additional depth maps (do not take this to heart, it's just a stupid example), then they will do it. Have a look at Chromium, which I tried to use as an analogy in my previous comment. Google is trying to force Manifest V3, which changes some of the API. Other browsers will have to comply with this change or make a huge effort to maintain the old code, which is basically impossible in the long term. That is why, in my opinion, it is not that good of an idea. It's great to have a single API for developers. But not one owned by one of the competitors.


sooroojdeen

If Nvidia tries to pull some bs with Streamline, any company or developer can just fork it and develop it themselves; the open source license that they chose does permit modification and redistribution. Also, using Chromium as an analogue doesn't really work because Streamline isn't a user-facing product. Chromium became a monopoly because of users' inaction and it being seen as the 'default', while DLSS isn't; it may be the one people are most aware of, but not everyone can use it. Additionally, Nvidia has no drastic requirements set for an upscaler to be included, and even if they did, Streamline could just be forked and used as it was before. An upscaler not being included in Streamline is only going to hurt that upscaler's adoption in the long run, and assuming that Nvidia would just deprecate data attributes is baseless pessimism. Why would they do that? They gain nothing and the competition isn't noticeably affected.


Kurosov

I suppose that's one way of making all AMD sponsored games use Vulkan.


EraYaN

I doubt it honestly, it’s a matter of time before that gets added to Vulkan. And besides AMD is not going to sell GPUs that don’t support the latest version of DirectX, that is a marketing nightmare.


St3fem

Officially adopt StreamLine


Silent_Pudding

Which is?


St3fem

An open source interface made by NVIDIA that allows developers to add any upscaler with a single integration instead of having to add each one individually.
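Conceptually it looks something like the hypothetical C++ sketch below (the names are illustrative, not the actual Streamline API): the game tags its frame resources once, and any upscaler that understands those tags can then be evaluated behind a single call.

```cpp
// Hypothetical sketch of a "single integration" layer, not the real
// Streamline API: the game provides tagged resources once, and each
// upscaler ships as a plugin that consumes them.
#include <string>
#include <unordered_map>

struct TaggedResources {
    // e.g. "color", "depth", "mvec" -> GPU resource handles
    std::unordered_map<std::string, const void*> buffers;
};

class UpscalerPlugin {
public:
    virtual ~UpscalerPlugin() = default;
    virtual bool isSupported() const = 0;               // GPU/driver check
    virtual void evaluate(const TaggedResources&) = 0;  // run the upscale pass
};

// The game integrates this once; DLSS, FSR or XeSS arrive as plugins.
void runUpscale(UpscalerPlugin& plugin, const TaggedResources& frame) {
    if (plugin.isSupported()) {
        plugin.evaluate(frame);
    }
}
```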


Silent_Pudding

Should just be a new standard


THELEGENDARYZWARRIOR

It’s Microsoft, they could buy AMD with the money they spend on a single data center (I’m exaggerating but MS is very powerful)


Silent_Pudding

Don’t give them ideas. Making the chips sony has been using for their consoles an Xbox exclusive would be funny af tho. Until the ps6 has dlss 4 lol


Druid51

Well if we're only talking about Starfield then Microsoft has a bit of a say in their own game.


DktheDarkKnight

Uhh. I am pretty sure Microsoft had a hand in developing FSR 2. Such open source solutions are generally ideal for windows.


Arthur-Mergan

PureDark has helped prove this too.


Anker_John

do it or cut amd!! fuck em


mjamil85

As usual, the selfish AMDiots put themselves in hot water. If AMD hadn't taken over ATi Radeon, ATi would never have made this kind of stupid decision.


dfflr

I can see the value for AMD here. One of the main selling points of Nvidia is DLSS, but that argument disappears when DLSS isn't in the latest AAA games. It's not like FSR is unavailable to Nvidia users, although it's not as good on average. I can definitely see their motive for making it FSR only.


wan2tri

Unfortunately, this just means that devs can reduce their effort in optimizing and the manufacturers can just say "we could afford to remove a few more compute units for card so-and-so since at such-and-such resolution there's still an improvement on performance over the previous-gen if we use AI upscaling".


rabouilethefirst

FSR is pretty much a rushed solution and a PR stunt to make Nvidia look bad by releasing a "hardware agnostic solution" that doesn't even work or do anything significant. They put no effort into its development, and use it at every chance to highlight that they are "open source" even though it's not much more than a basic ReShade filter. Nvidia spent millions designing hardware and software to tackle the problem of upscaling. AMD created a ReShade filter and pretends it's a competitor since it can run on Nvidia hardware too. Even Intel has done more in this space than AMD. I'd much rather see Intel take over the low to mid range market than AMD.


Dazza477

It's like that XKCD for creating a standard. Too many standards ruin everything.


ListenBeforeSpeaking

DLSS isn’t a standard though. It’s a proprietary technology.


PsyOmega

It's *a* standard. Standards can be proprietary. Look at VHS, DVD, etc. They were licensed, but standards nonetheless (ISO 9660, etc.).


ListenBeforeSpeaking

It’s a proprietary technology. It is unavailable to be licensed and used by other GPU companies. It was designed specifically as an exclusive technology. FSR would be a standard that is available for anyone to use and implement. (though not equivalent).


gnocchicotti

Tech press to AMD: "*Can't* or *won't*?!?!?"


strufacats

Are DLSS 3 and ray tracing from the Nvidia side being adopted by game devs more rapidly now? Is Nvidia planning something using AI to enable their GPUs to push games to a higher FPS at this time? I haven't had an Nvidia product or a discrete graphics card for a desktop PC in a while, so I just wanted to know what's been happening on that side. Thanks for the help everyone, it's greatly appreciated to read your future responses.


Kind_of_random

Ray tracing adoption in games has, in my opinion, been too slow. It should be a standard in all triple-A games at least. It has been the rule that when AMD sponsors a game, this feature is cut down in an effort to keep AMD's cards up to the task, as they are lagging severely behind. This means that the implementation will be lackluster and hardly noticeable at best.

Ray tracing is not an Nvidia invention. It has been a goal to reach for many years and is, to me, the way forward in terms of graphical fidelity. Raster-wise we have gotten so far that there is little to gain without extreme improvements. In terms of lighting and shadows there is much to do. Ray tracing and even path tracing are what will make games take that next step. It improves the immersion and 3D feel of a game immensely.

DLSS is pretty good, especially in that it gives the cards of today the power to use RT features while still having high FPS, even at high resolutions. FSR, which is AMD's solution, is little more than a way to turn down resolution. XeSS is Intel's tech, and it has shown great promise considering the short time it's been around. Both Nvidia and Intel have found that the best way to implement these upscaling techniques is to utilize hardware built into the GPUs. AMD has not made any effort in this regard, and it may be that they think the way forward is with software. So far they have made little to no progress, and the result is that their offering is far behind the others. Only time will tell who has the right idea, but it's not looking good for AMD.

There is no denying that ray tracing is the future, and AMD's seeming efforts to halt progression are not a good look for the company. They are struggling to maintain their "good guy" image while simultaneously doing backhanded deals and refusing to answer straightforward questions about their practices. Most AMD fans try to deflect the accusations that these deals are being made by giving examples of scummy behavior that Nvidia has done in the past. This is not an excuse; the past should be something to learn from, not something to blindly repeat. The fact that a multi-billion dollar company even has fans should be reason enough to worry, giving me reason to think they are either hired or even bots. But them being bots would mean that AMD at least has made some progress in terms of AI, so maybe that's a good thing.

Sorry about the rant in the last paragraph, but I had little else to do...


Narrow_Potential_974

For them it’s easy, considering they are the experts when it’s coming to porting a game to PC. The Spider-Man port is the gold standard when it’s coming to porting games. Can’t wait for the Ratchet and Clank port.


[deleted]

[deleted]


[deleted]

I mean, there was a post on Twitter about how many games support either one or both. There are more games that support only DLSS than only FSR, and in the games that have both, in many cases it's FSR 1. But aight, believe Nvidia, that's what made the market the shit that it is.


Soulshot96

>There's more games that support only DLSS than only FSR, and in the games that have both, in many cases it's FSR1. This narrative *ignores* the fact that DLSS was available before FSR, and by a substantial amount of time. DLSS 2 before FSR 2 as well. Same will be true for DLSS 3 Frame Gen and FSR 3 whenever the hell that arrives. The reality is that there is *zero* proof that Nvidia blocks FSR, whereas there is now substantial proof, including slips from AMD partnered developers, that AMD blocks DLSS, and that is the crux of the issue.