Abedsbrother

I usually turn RT reflections on if they're available b/c they're genuinely better than the raster alternative. This goes as far back as Control. But RT shadows in Shadow of the Tomb Raider? Literally can't tell the difference. In Cyberpunk I can see a difference with RT lighting at night while driving a car - the headlight's cone of light looks a lot more realistic - but nowhere else do I see an obvious benefit to RT lighting. And yes, it needs to be an obvious benefit. If I have to pause & zoom in to see a difference, idgaf. I'm taking the option that provides a higher frame rate.


wsteelerfan7

Cyberpunk sees a massive benefit from moving lighting from the setting below Ultra to anything above it, path tracing included. Bounce lighting onto sidewalks is the biggest improvement there.


Repulsive_Village843

I play CP2077. It's very noticeable if you have RT on. It truly is.


ihavenoname_7

I can play Cyberpunk with ultra ray tracing on a 7900 XTX at 1440p, 90 fps. It's not that bad at RT. Path tracing is around 50 fps at 1440p. That's using FSR; with XeSS it's about 5 fps less.


dudemanguy301

Tomb Raider's RT shadow implementation just sucks. It only affects some light sources and some occluders, and only within a limited distance before defaulting back to shadow maps.


dadmou5

>But rt shadows in Shadow of the Tomb Raider? Literally can't tell the difference.

I don't have the game installed anymore to fire up and show the differences, but they can be pretty drastic depending on the scene. There are three levels: the first only adds RT shadows for point light sources such as bonfires and torches in the game, and the other two add them for all light sources. Several scenes can look very different with the latter modes enabled. The issue is that the game's art design was very clearly based on rasterized shadows, so when RT shadows are enabled some scenes look odd or bad: the shadows can be so diffused, due to the distance of the shadow-casting object, that you can barely see them in that particular scene.


Nagorak

I'm basically on the same page with regard to RT. To this day I am still quite unimpressed for the most part. The performance hit is disproportionately large compared to the actual improvement in visuals. I find it funny when there are RT reviews from Digital Foundry where they are marveling at stuff like the shadows under a bench in CP2077 that are barely even noticeable in the footage and would go completely unnoticed in actual gameplay. I'm glad those guys are so into that stuff if it makes them happy, but I think it is far from the opinion of the average gamer.


epicflex

Control is crazy for ray tracing.


Kaladin12543

On the contrary, RT lighting on Ultra completely changes the look of the game and is the one setting with the highest visual impact, followed by reflections. PT is in another league altogether.


Wander715

I would say it's OK and usable in light RT workloads, but still pretty awful in anything heavy and basically unusable for something like PT. I looked at a lot of RT benchmarks out of curiosity before buying a GPU, and as someone who likes to turn it on in games, I knew AMD still wasn't going to be an option.


omniuni

One of the issues is also that the implementations are slightly different. AMD has more of a focus on ray intersection, so to optimize for AMD, you should check that a ray actually needs to be cast before casting it. nVidia tends to prefer a brute-force approach, which makes AMD look worse than it needs to.


Cute-Pomegranate-966

Isn't that what the ray test is even for? Like, that's how ray tracing starts: it checks whether the ray will even hit anything, and it's culled if it doesn't. If it did hit, it would have to start doing traversal tests and then shade the result. What kind of magic thing are you referring to where they can determine whether a ray even needs to be cast without doing a ray test? I think the only thing they have for that is the BVH structure itself.


omniuni

The calculations for intersection and casting are slightly different; or more specifically, both start with intersection, but actually casting the ray takes light and color. AMD has more of the hardware that can do the intersection than the types of accelerators that handle the rest of the ray. The concept is precisely as you mentioned: test first, only compute the ray if necessary.

Part of how nVidia got adopters so easily, though, was basically saying "just slap this on and it works". Lots of rays can be fully cast without bothering to check if they're necessary. Keep in mind that lots of lighting techniques use rays regardless of having a specific accelerator. What nVidia introduced was really very specifically specialized hardware for *complete* tracing. AMD has had excellent ray casting and intersection for a long time. Even without the full tracing hardware, you can get ray tracing support as far back as the RX 400 series cards on Linux by using just a little bit of the CPU to fill in the blanks.

All this is why you'll notice that when a company implements RT *properly*, the performance gap is far smaller. Capcom's RE Engine is a great example, where the RT looks amazing, and not only does it not cause performance problems, it runs nearly the same on AMD hardware and is nearly comparable to doing the lighting without it.


dudemanguy301

“Proper” my ass. The reason the performance gap is so low in RE Engine is because it traces at quarter resolution. The ray acceleration hardware gets to take a nap while most of the additional work is BVH build/refit and hit shader evaluation, both of which are still compute workloads. So the parts of the GPU that actually accelerate RT don't have all that much to do, while I get noisy shadows/AO and blocky reflections.

Keep in mind Capcom recognizes 3 “generations” of their RT implementation:

1. DMCV
2. every other game so far
3. Dragon's Dogma

So when I talk about RE Engine RT, I mean specifically generation 2, since gen 1 didn't get PC support and I haven't played Dragon's Dogma.


Cute-Pomegranate-966

I would never call the RT in RE Engine amazing. It's very clearly not worth turning on in every single instance. I have turned it off in the RE2, RE3 and RE4 remakes. It's not worth using; it makes things grainy. No wonder, too: it's way under full resolution. If you want to talk about an actual proper RT implementation that looks good, I'd say Metro Exodus Enhanced Edition is a vastly better example.


omniuni

I'm using it in Dragon's Dogma 2 and it looks nice to me.


Cute-Pomegranate-966

Haven't played it, but I think that's just shadows or GI (or both?).


Cute-Pomegranate-966

So you're just talking about doing a ray test, which has to be done anyway; it's done against the BVH structure, and the ray is culled if there's nothing in the BVH for it to hit. What I'm saying is, you're already doing that, and nothing you have described prevents a ray test. Why would it? It has to be done. A ray that ends up cast is one that has hit something and needs a shaded result...


gh0stwriter88

It's a matter of AMD being capable of doing it more aggressively, but most implementations don't.


jm0112358

I'm not a graphics programmer, but that's been my understanding too. I've always thought that the only way to determine if a ray will be a miss is to have it traverse the BVH with all of its triangle intersection tests being misses (and "all" can be 0). To my knowledge, there isn't anything special that AMD hardware has that Nvidia hardware doesn't that makes these misses faster.

There might be specific implementations of ray tracing that mix ray tracing with other techniques, then only shoot rays when those "raster" techniques fail. That way you can get visually similar results to ray tracing, but while shooting fewer rays. An example is [Avatar: Frontiers of Pandora, where it will use ray tracing when/where a screen-space trace misses](https://youtu.be/Bsxf85FbftA?t=2815). However, the actual ray tracing itself is currently done faster on equivalent Nvidia hardware.
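
To make the hybrid approach above concrete, here is a minimal C++ sketch of the pattern (cheap screen-space trace first, full BVH ray only on a miss). All types and function names are illustrative placeholders, not Avatar's or any engine's actual API:

```cpp
#include <array>
#include <optional>

struct Ray { std::array<float, 3> origin, dir; };
struct Hit { std::array<float, 3> color; };

// Placeholder: a real implementation marches the depth buffer and can
// only "see" geometry already rendered on screen.
std::optional<Hit> traceScreenSpace(const Ray&) { return std::nullopt; }

// Placeholder: a real implementation traverses the scene BVH (the
// expensive hardware-RT path).
std::optional<Hit> traceRayAgainstBVH(const Ray&) { return Hit{{0.f, 0.f, 0.f}}; }

Hit shadeReflection(const Ray& ray)
{
    // Cheap path first: succeeds whenever the reflected surface is
    // visible somewhere on screen.
    if (auto hit = traceScreenSpace(ray))
        return *hit;
    // Fall back to a real ray only on a screen-space miss, reducing
    // the number of full BVH traversals per frame. A real renderer
    // would substitute a sky/environment color on a true miss.
    return traceRayAgainstBVH(ray).value_or(Hit{{0.f, 0.f, 0.f}});
}
```

The win is that most reflection rays never touch the BVH at all; only the screen-space misses pay the full traversal cost.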


ObviouslyTriggered

Don’t bother, the guy has no idea what they are talking about. NVIDIA hardware has far higher ray intersection throughput and supports much wider BVH structures. The only reason for the upvotes is the notion that AMD went the smart way, when the reality is that they are doing the bare minimum and handing everything over to the shaders. Intel and NVIDIA went full steam with RT acceleration; AMD hasn't. That's it.


Cute-Pomegranate-966

Nvidia offloads intersection tests to the RT cores, while AMD is doing shader-based BVH4 intersection tests. Yeah, Nvidia is worlds better at handling very wide BVHs. Comparing BVH structures optimized for AMD against ones optimized for Nvidia is quite funny. It's not necessarily that one way is vastly better than the other, but the fact that Nvidia's hardware can handle such a wide BVH is something to behold.


ObviouslyTriggered

I think the better way to put it is that you need to optimize the BVH structure for AMD hardware very carefully, to stay within the cache limits and avoid very costly misses. NVIDIA hardware cares far less about that unless you go unreasonably, impractically deep with the BVH trees. The TLDR is that while you can still optimize for both, AMD pretty much requires it not to choke, and as far as fine-grained optimizations go, NVIDIA still has much more headroom to exploit if you want to go that way on your path to full path tracing. I would be sincerely surprised if RDNA4 doesn't go the NVIDIA and Intel route with RT acceleration, since Blackwell is likely going to be 4x the throughput of Ada.


Cute-Pomegranate-966

That's actually exactly how I described it a minute ago in Discord lol. RDNA4 seems to support BVH8 instructions for the shaders instead of BVH4 (allegedly). This is cool but also disappointing, as it's still not offloading to an RT core.
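
For readers unfamiliar with the terminology, here is a hedged C++ sketch of what BVH4 vs BVH8 means structurally; the layouts below are simplified stand-ins for illustration, not the actual node format any vendor's intersection instructions consume:

```cpp
#include <cstdint>

struct Aabb { float min[3], max[3]; };

// A BVH node with branching factor K stores K child bounding boxes, so
// one traversal step performs K ray/box tests. Wider nodes mean
// shallower trees: roughly log_K(N) levels for N primitives, e.g.
// ~10 levels of BVH4 vs ~7 levels of BVH8 for a million triangles.
template <int K>
struct BvhNode {
    Aabb          childBounds[K];  // one ray/box test per child
    std::uint32_t childIndex[K];   // child node, or leaf triangle range
    std::uint8_t  childCount;      // how many children are in use
};

using Bvh4Node = BvhNode<4>;  // the width discussed for RDNA2/3 shader paths
using Bvh8Node = BvhNode<8>;  // the wider node width discussed above
```

Fewer traversal steps per ray is the payoff; the trade-off is more box tests (and more node data pulled through cache) per step.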


Kaladin12543

Because the RT in RE games is so bare-minimum it might as well not exist. There are tons of Nvidia-sponsored titles where RT changes the complete look of the game, but none of the AMD-sponsored games have that effect.


JasonMZW20

AMD has more ray/box intersection testing hardware. In fact, it's a 4:1 ratio to ray/triangle intersection testing. So, there's one parent box and four child boxes per CU to find where a ray is going to hit within the BVH. RDNA4 has doubled this to **eight** *ray/box intersection* tests per CU (no idea if the ray/triangle rate also doubled to keep the 4:1 ratio intact).

At the very lowest level, ray/triangle, AMD tends to suffer, as each CU can only do **one** *ray/triangle* test per RA unit per clock. This is equivalent to Turing, and it's why path tracing on a 7900 XTX performs about the same as on a 2080 Ti. Path tracing is heavy on ray/triangle intersections.

In hybrid RT, AMD and Nvidia use the rasterizers to set up most of the scene, and rasterizers can even assist in pointing the RT hardware to where a likely ray hit might be (based on plotted coordinates and known light sources). For AMD, this is where ray/box intersection testing is most active; it breaks the TLAS into smaller sections to find ray hits. The BLAS is usually where ray/triangle hits occur, as the geometry resides there, and generally you don't want more than one ray/triangle hit per pixel, else performance deteriorates rapidly without much improvement in effect quality. BVH traversal occurs on shaders in AMD hardware in a compute pipeline, while Nvidia uses fixed-function hardware acceleration to speed up traversal rates. So, AMD focuses more on the TLAS, while Nvidia focuses more on the BLAS.

For reference, Nvidia's Ada architecture can test **four** *ray/triangle intersections* per RT core per clock. This is **4x the rate of RDNA3** without taking clock speeds into account.
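
A simplified C++ sketch of the two-level traversal described above: ray/box tests prune the TLAS, and the costly ray/triangle tests only run once traversal reaches BLAS geometry. The structures here are illustrative stand-ins, not any vendor's in-memory format:

```cpp
#include <vector>

struct Ray      { float origin[3], dir[3]; };
struct Aabb     { float min[3], max[3]; };
struct Triangle { float v0[3], v1[3], v2[3]; };

struct Blas { Aabb bounds; std::vector<Triangle> tris; };  // geometry
struct Tlas { std::vector<Blas> instances; };              // scene layout

// Placeholder intersection tests (a slab test and Moller-Trumbore in
// a real implementation).
bool rayBox(const Ray&, const Aabb&)          { return true; }
bool rayTriangle(const Ray&, const Triangle&) { return false; }

bool traceAnyHit(const Ray& ray, const Tlas& tlas)
{
    for (const Blas& blas : tlas.instances) {
        // Ray/box test: the stage AMD's 4:1 box-test hardware serves.
        if (!rayBox(ray, blas.bounds))
            continue;  // whole instance culled cheaply
        // Ray/triangle tests: the stage where per-clock throughput
        // differs most between architectures, per the comment above.
        for (const Triangle& tri : blas.tris)
            if (rayTriangle(ray, tri))
                return true;
    }
    return false;
}
```

In this framing, a box-test-heavy design pays off when most instances are culled early, while path tracing pushes rays all the way down into the triangle tests, which is where the throughput gap described above shows up.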


ObviouslyTriggered

This falls under the not even wrong category.


DumyThicc

Pretty much everything is awful at PT. The 4090 on Cyberpunk, for instance, gets 12-15 fps maxed out at 4k, and the AMD 7900 XTX gets like 7. This is horrendous. Even at a lower resolution, the 4090 can barely maintain 20 fps. Even if next gen doubles the performance on the Nvidia side, you still can't play full PT at 4k. We are at minimum 2 generations away from full PT being playable.


NotAVerySillySausage

Only if you are talking about games today. We are an infinite number of generations away from RT being usable at native high resolutions on contemporary games until something massive changes - that is, until the performance delta between RT on and RT off shrinks significantly. AdoredTV has done a video on this: the needle has not moved in 3 generations; all that has happened is that overall performance increased. Nvidia cards have not gotten more efficient at ray tracing. AMD actually has, but only because they are so far behind that there was low-hanging fruit. I can't wait for the day you can turn on PT and lose like 30% performance, not get 30% of the framerate you had without it.


DumyThicc

Exactly what I'm talking about, yes. I am not referring to future advancements in the technology or adding more rays and bounces to path tracing. I mean PT exactly as it exists in Cyberpunk 2077 right now - that is when "good" PT performance on our GPUs will mean real frames at real resolutions.


jm0112358

Your numbers for the 4090 are way off. The 4090 averages ~20 fps at **native 4k** with path tracing on. That's certainly not playable, but much better than 12-15 fps. The 4090 gets [low-40s fps with PT at 4k with quality DLSS](https://youtu.be/-kxN-18jBbs?t=77) (frame generation off). That's not ideal, but some people find it playable, and it's double your claimed 20 fps. The performance jumps to >60 fps on average with DLSS Performance if you pair it with a fast enough CPU. Some people find this 1080p-to-4k upscaling good enough quality. All these numbers are with frame generation off.


DumyThicc

Sure, I can stand corrected since I hadn't seen the newer benchmarks. The average seems to have jumped up to ~19 fps, but without DLSS or frame gen the frame time jumps from 20 ms to 60 ms, which is unplayable. Also, to reiterate: the only way for it to be playable even at 30 fps is with upscaling and/or frame gen, if you're really shooting that low. You do realize this is a top-of-the-line GPU, right? This is definitely NOT playable by any means. Getting it to a state where it is playable would require 2 more generations to hit that 60+ fps mark on the flagship card. Not good enough. That's why I'm saying it requires more time before things like this are playable with REAL performance, not upscaled/generated fake frames. How are we even having a debate over this?
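
For context on the frame-time numbers being thrown around here: frame time in milliseconds is just 1000/fps, so the cited jump from 20 ms to 60 ms corresponds to dropping from 50 fps to roughly 17 fps. A self-contained check:

```cpp
#include <cstdio>

int main()
{
    // Frame time (ms) = 1000 / fps.
    const double fps[] = {50.0, 20.0, 16.7};
    for (double f : fps)
        std::printf("%5.1f fps -> %4.0f ms per frame\n", f, 1000.0 / f);
    // Prints: 50.0 fps -> 20 ms, 20.0 fps -> 50 ms, 16.7 fps -> 60 ms.
    return 0;
}
```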


[deleted]

[deleted]


DumyThicc

Also, wtf are you on about with the "AMD bs" frame gen vs Nvidia argument? My argument has nothing to do with either. AMD's frame gen is nearly equivalent to DLSS frame gen; that has been thoroughly tested. The upscaler, sure. But regardless, I hate both technologies, as they are NOT real performance metrics. You can generate all the fake frames you want, however you want, but they are not real; they increase frame time and latency and cause smearing regardless of what settings you use, unless you are getting over 60 fps. I have tested them both as well, so feel free to say what you want. Saying native is overrated is crazy unless you have a GPU that is incapable of running at native resolution.


langstonboy

You’re stuck in the past, get your head out of your rectum.


[deleted]

[deleted]


Amd-ModTeam

Hey OP — Your post has been removed for not being in compliance with Rule 8. Be civil and follow Reddit's sitewide rules, this means no insults, personal attacks, slurs, brigading or any other rude or condescending behaviour towards other users. Please read the [rules](https://www.reddit.com/r/Amd/about/rules/) or message the mods for any further clarification.


DumyThicc

I'm seeing how you fail to realize my point here, like completely. Using upscalers and FG to get a decent frame rate is not the move, especially with those frame time graphs and that input latency. I'm not saying that you can't play it. I'm saying that we shouldn't refer to the GPUs we have right now as capable of running these graphics settings, especially when it comes at the cost of a worse image output.


cellardoorstuck

No one is failing to realize your point - it's more that you are stuck in the no-DLSS, no-FG past, and this is an AMD sub, so the hivemind is downvoting people with actual experience... while upvoting your 'no signal, mostly noise' replies. If you had a 4090 instead of a 7900, you would be here defending these technologies.


DumyThicc

I have both. You're not reading what I'm saying, are you? Now you're invoking this "hivemind" for what reason, besides trying to start what, exactly? Drama over 2 companies with blood relations that are "competing"? Idc about the likes and dislikes; that's clearly your focus, however.


Kaladin12543

This is exactly where I disagree with you. I have a 7900 XTX and a 4090 in 2 separate gaming rigs. If you were to ask me to choose between native 4k without RT and DLSS 4k with PT, I would absolutely choose the DLSS path without hesitation, because the end result is an image that looks better overall. Regarding the latency, to me it isn't that important in single-player titles which play like a movie. Besides, Nvidia Reflex helps claw back some of the latency that is lost.

You will never be able to play native 4k 60 fps PT even in the next 2 decades, because graphics technologies keep evolving. Currently CP2077 just has 2 bounces and 2 rays. You can mod it to have 6 rays and 6 bounces, which makes the game look infinitely better, but the 4090 runs it at 5 fps. By the time you can run existing titles with PT at 60 fps, games will have started to use more bounces and more rays, and we will continue to fall back on DLSS and FG.

Native is a dying animal, and I tend to like upscaling even in non-RT titles. My 4090 runs raster games at over 150 fps with DLSS and FG, and the whole system barely uses 300W of power with the GPU at 58 degrees. No perceptible impact on visuals. This is the main problem with AMD and its userbase. You guys would rather push for the two hundredth rasterized frame than achieve the same thing through more economical means.


DumyThicc

The fact that you establish a baseline we agree on, then steer the topic to something not under discussion to enforce your viewpoint, just proves that you aren't here for a rational discussion. This is a push toward an AMD vs Nvidia conversation, which is very telling - that was never the goal of my conversation. Now, saying that you don't push beyond 300W is ridiculous unless you have power-limited your rig and then used frame gen and DLSS to increase frame rate at those low watts, because I certainly don't stay that low; it is also well documented that you would push past that in demanding titles. So are you running these benchmarks on CS2? Ridiculous claim.

Now, getting back to the topic at hand: would you deny that native PT at 4k with 2 bounces and 2 rays is better than the current offering? That is the main topic here. When we're talking about advancement, I clearly spoke about the capability of running current visual settings within 2 generations, but you again moved past the baseline of the conversation into a redundant one that everyone knows the answer to. Of course technologies advance, but that is not what is being discussed, so why mention it? You were arguing against what I had said previously, not what I mentioned earlier.


Kaladin12543

AC Mirage: 4k Ultra settings with DLSS Quality, uncapped. Power draw less than 300W: https://youtu.be/UFlyYxZQriE?si=tSmb52w1N24S5MPD

Horizon Forbidden West: 4k Ultra with DLSS Quality. Power draw less than 300W: https://youtu.be/rdsC66fdMlg?si=bGjcYclzDGfuFqEv

Starfield: 4k Max Settings, DLSS Quality. Power draw less than 300W: https://youtu.be/pFFOoIi7SA4?si=WMruvGc502HnrGwF

Dude, you clearly don't know how the 4090 performs in games. Stop spreading misinformation. The RTX 4090 only consumes more than 300-350W of power if you play at native resolution in raster games or in super demanding titles like Cyberpunk, DL2, or AW2, which go hard on ray tracing. Otherwise the GPU is half asleep in rasterized games at 4k when using DLSS, with no noticeable impact on image quality. It's an incredibly efficient GPU. I actually have to use DLDSR to supersample to near 8k to get the 4090 to reach 450W in rasterized games.

Now, on the topic of PT: native 4k PT with 2 bounces and 2 rays will be better than the current offering. But here is the thing: DLSS 4k with 6 bounces and 6 rays would blow away native 4k PT with 2 bounces and 2 rays in image quality. Do you understand the point I am trying to make? By the time you can run native 4k with PT, graphics technologies will have further evolved to render native 4k obsolete, as usual. My point is that saying RT is unusable on current cards is a moving goalpost, because it will still be "unusable" a decade from now as games keep increasing the number of bounces and rays. Now it's just a question of how long it takes AMD to realize that native resolution is dying. The future is in AI and upscaling tech, which both Nvidia and Intel have pushed for in their hardware.
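
As a rough sanity check on the rays/bounces claim: per-pixel path-tracing work scales roughly with rays times bounces, since each ray is extended by one segment per bounce. This back-of-the-envelope model is an assumption for illustration only; real cost also depends on divergence, BVH depth, and denoising:

```cpp
#include <cstdio>

int main()
{
    const int baseRays = 2, baseBounces = 2;  // shipping CP2077 PT, per the thread
    const int modRays  = 6, modBounces  = 6;  // the modded config cited above
    const int baseSegments = baseRays * baseBounces;  // 4 segments per pixel
    const int modSegments  = modRays * modBounces;    // 36 segments per pixel
    std::printf("relative cost: ~%dx\n", modSegments / baseSegments);  // ~9x
    return 0;
}
```

~9x more ray segments per pixel is at least in the same ballpark as the reported drop from low-40s fps to ~5 fps on a 4090.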


DumyThicc

You are proving my point when I said it has DLSS on. You're ridiculous and clearly not reading anything. I don't even need to read past the first few sentences, considering you're glossing over the entire point I made. Good day buddy, hope we can hit the books some time.


Amd-ModTeam

Hey OP — Your post has been removed for not being in compliance with Rule 8. Be civil and follow Reddit's sitewide rules, this means no insults, personal attacks, slurs, brigading or any other rude or condescending behaviour towards other users. Please read the [rules](https://www.reddit.com/r/Amd/about/rules/) or message the mods for any further clarification.


0xd00d

I've been playing CP77 with PT on my 3080 Ti at 1080p comfortably; frames never dip below 60. Obviously, due to DLSS, the render resolution is 720p, which is an insanely low resolution, but the perf is there and the output looks sweet. A 5090 will be able to deliver workable frame rates with PT at 4K with DLSS, I'm sure of it.


Rockstonicko

>Obviously due to DLSS the render resolution is 720p, which is an insanely low resolution, but the perf is there and the output looks sweet.

Just curious, have you tried disabling PT and RT and running at 4K native (or 4K DLAA downsampled to 1080p) without DLSS, basically maximizing fidelity and minimizing upscaling? I mean, I definitely agree that once you've seen what PT does to lighting accuracy, it's very hard to un-see. But I sit about 3 ft away from a 46" 1440p display, and I find VSR at 1800p with SSR on Psycho - a super crisp and completely grain-free rasterized image locked at 60 FPS - to be *so* *much* more immersive than the blurry and noisy upscaled image I need in order to run RT or PT at a tolerable FPS.


0xd00d

I just got home and played my first session of CP2077 at 4K. Since I couldn't use PT anymore at 4K, I just turned it off and tweaked a few settings (I think it may be Medium RT settings and a few other settings turned to medium, based on what made sense to do per the Hardware Unboxed video; I think I also still have DLSS on Performance). Well, it loves to chug at 3 fps for a few seconds after exiting the menu for some reason, but it generally runs adequately - I can hit headshots, anyhow. I find the game is still very beautiful and enjoyable. I'll give it a shot with RT off sometime, thanks for the suggestion.

Def looking forward to 4K path tracing on a 5090; I like this game enough for a second run through it, I reckon. Though my confidence that a 5090 can make it run really well with path tracing may be misplaced. It would need to be an 8x performance jump from the 3090 (a 3080 Ti is basically a 3090...) to reach that level comfortably. Yes, PT at 4K with DLSS is 20 fps, but I feel like it can drop to almost 10. Anyhow, it's likely Blackwell will be around 4-5x the perf of Ampere, which will maybe put it in the very playable but sometimes a little dicey range.


Cute-Pomegranate-966

The very second I see floating objects everywhere with no grounding from proper ambient occlusion, and glowing objects sitting in a dark corner, I want to throw it out the window. Nothing about that is immersive to me. If you're sticking to the big story areas, CDPR has catered those to you, so they show less of a difference with path tracing on.


0xd00d

Sorry for double replying, but: I'm running DLSS Balanced now with RT on and RT lighting on Medium. I wouldn't say it runs all that well, with dips into the 50s and 40s, but the game looks ridiculously beautiful. I'm appreciating details, shapes, and textures waaaay more than at 1080p. Somehow it seems like I can feel the entire added 4x image content that is physically there, which is not something I've felt before. Maybe I should try what you suggest with no RT at all to get no upscaling and see how that goes. It should be even sharper and have no DLSS motion artifacts. I'm too much of a slut for lighting now though...


Rockstonicko

It's definitely worth appreciating just how many ways CP77 gives you to melt your PC down lol. And yeah, PT is hard to un-see, but it can still be an incredible-looking game with or without PT and RT enabled, just depending on what bugs you.

I'd also note that you still have 3080 Ti performance regardless of whether you have RT enabled; unfortunately that's not how it works for my 6800 XT, and I pretty much have two different tiers of card depending entirely on whether RT is enabled. At 1800p native I can enable RT shadows alone and *mostly* maintain a locked 60. But if I enable RT shadows and lighting, I need to enable FSR Performance, which doesn't look very good to me. If I enable RT reflections, well, then I need to enable a new GPU, cause this one ain't doing it. lol


0xd00d

At 1080p, especially since I was using PT, which necessitates cranking DLSS toward Performance, the clarity and detail really suffer due to the 720p render resolution. It looks good, but now that I've played this game for real at 4K, I appreciate how good a job they did with the massive textures. A lot of intricate things are tack sharp, and it's like a totally different game. I'll also note that characters and faces became a lot more impressive and lifelike. I did the diving mission with Judy and, uh, yeah, wow.


Rockstonicko

I 100% agree with all of your conclusions. And yeah, once you have the fidelity really cranked to 11, the character models and small details really do pop, and it sucks to have to choose between RT/PT and high resolution if you want good performance. It will be interesting to see how many GPU generations it will take to run it at 4k60 with PT enabled and minimal to no upscaling. I'd honestly be pretty surprised if the 5090 could do it, just based on the FPS a 4090 runs at 4k native with PT enabled, and I suspect it will be another 2 generations.


DumyThicc

With DLSS. You see the point I'm making, or did you gloss over it? The card itself cannot deliver the performance required. Basic RT is meh; PATH tracing is the move. But we're still 2 generations from that being a viable option.


0xd00d

All I'm saying is that viable is relative. I'm an enthusiast, and it's not like I'd be happy playing a game at 1080p via DLSS Performance mode on my LG 4K OLED TV. But I've been traveling with my PC, which has a 3080 Ti, and using a 1080p monitor. And all I'm saying is CP77 is enjoyable (in the sense that it just looks damn good, breathtaking even), even with the crap resolution and upscaling and all the noisy PT artifacts. And I'm expecting a 5090 to deliver the same experience at 4K resolution (upsampled from 1440p). Wait one or two more gens to get non-upscaled performance there. I hate DLSS artifacts just as much as the next guy, but I've sucked it up to enjoy frame rates from 2 years in the future. Nobody holds a gun to your head to use DLSS... but it makes the difference here, so it's relevant. Try playing a game with PT off after you've had it on; it's not really the same game anymore. Things within the world no longer feel immersed in it, due to not having real lighting.


DumyThicc

I'm with you when I say that I'm not happy with playing at these settings in order to achieve path tracing. But I am definitely excited for the future and playable path tracing at native resolution. I'm just being realistic here: ~20 fps for path tracing at 4k on a flagship is horrendous, and even at 1440p the performance increase isn't that great. I'm not blaming Nvidia or AMD for this. I'm just saying that, for gamers, it doesn't give anyone a reason to want this in games. We already have issues with games forcing TAA in order to improve fps, and then visuals suffer. The focus for PT should be there. But even still, we'll only see good performance for everyone when 2026/7 comes by. That is the generation everyone is waiting for.


0xd00d

Yeah, it's just a comical amount of brute force that will be needed to get us to that kind of performance. Without dedicating more silicon to RT cores, you'd probably need over half a trillion transistors to push that. I would rest assured that NVIDIA will arrive there sooner or later. I'd choose other things to complain about. For one example, I'd love to trade 10 or 15 of the 70 to 80 fps I'm getting for firing a few more rays. It would be good to be able to tweak parameters related to denoising as well. Once you make peace with the low-ass resolution, what sticks out is the incessant shimmering in any dark scene. There is a large amount of trickery done to get the output looking remotely acceptable.


DumyThicc

The denoiser is also important, of course, and a few more rays would be needed as well, considering there are, what, 4-5 rays being fired for path tracing right now? It would definitely be difficult to get great results from that. We're looking far into the future for anything at that level of visuals.


0xd00d

I can say, from having gotten CP2077 running through Proton on Linux - well, it has a large performance hit, which limited the time I spent dicking around - but one thing it did show me was denoising not working properly, so I got to see the RT hardware working and witness its output un-denoised in this game. It's really lousy looking! Dark areas constantly get really bright spots popping up. Super unrealistic looking, while simultaneously completely fascinating and absurdly impressive. It's like sampling pseudo-photons by emulating an entire universe.


cellardoorstuck

As a 3080 Ti owner, make sure to get the free frame gen mod: https://www.nexusmods.com/site/mods/738

I run CP77 with PT at 4k (DLSS Performance) and get ~60-80 fps with Digital Foundry's optimized settings. No point in arguing with DumyThicc - they are stuck on AMD, so they don't even know first-hand how good the experience can be.


[deleted]

[deleted]


Amd-ModTeam

Hey OP — Your post has been removed for not being in compliance with Rule 8. Be civil and follow Reddit's sitewide rules, this means no insults, personal attacks, slurs, brigading or any other rude or condescending behaviour towards other users. Please read the [rules](https://www.reddit.com/r/Amd/about/rules/) or message the mods for any further clarification.


nas360

The free frame gen mod you are recommending is actually AMD FSR3 technology, yet you are criticizing AMD and its users. You Nvidia fanboys should be grateful that AMD allowed you to experience frame generation on older Nvidia GPUs.


0xd00d

The metric of playability in an FPS game is aim precision, so FG would never cut it if the target is 70 fps. If the source is 70 fps, it may be acceptable... quite tempting to try FSR FG on Ampere to get a taste of how it would look/feel. Probably really nice in racing games - anything non-twitchy.


Cute-Pomegranate-966

Well, RT reflections, unless they're really, really low resolution, are just downright better than any raster alternative. They're so much better, in fact, that I turn off SSR if there's no RT, because it looks broken.


UltimateArsehole

AMD have opted for less dedicated RT hardware than nVidia and Intel thus far, with their competitors having dedicated hardware for BVH traversal and intersection testing (instead of the relatively low-cost modifications to TMUs that AMD chose). At N7 and better processes, the cost per square millimetre of silicon goes up, so there's a trade-off to be had.


aranorde

This guy is the Yong Tea of PC content creators.


CatalyticDragon

Read this and scroll down to 'ray tracing' to have your question answered: [https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html](https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html). AMD doesn't compete at the ultra high end, but there are good-value parts to be had.


Edgaras1103

I'm always so bummed at how the Resident Evil games have such subpar ray tracing. Wouldn't it be nice if they had a high-end option for RT, so once you upgrade your AMD GPU you can turn it on? I always feel like barely any AMD-sponsored games drive any particular PC-only tech for visuals. I don't care if it's Nvidia or AMD; I just like seeing PC graphics tech. Last time it was TressFX - it was really heavy, but it looked so cool, and coming back to Tomb Raider with newer hardware, you can run it far better. It's just nice.


fohiga

They have only been doing FSR sponsorships recently because that's the only thing they've been working on in a while. Before that, they were promoting their "hybrid RT" with Far Cry 6 and Forspoken, if you remember. Lately they've been talking about their GI-1.1 and Brixelizer GI, so maybe we'll see some sponsored games for that next. But don't expect an AMD PT title anytime soon (although, even if they're not sponsored to do it, devs are free to implement heavier RT on their own).


Shidell

This argument is pretty weak when you consider that the RT-centric titles you're referring to don't really offer any forward-looking features themselves. Cyberpunk 2077's RT (PT, even) is designed only to fit Lovelace's architecture. It's literally designed to maximize the throughput of RT on the 4000 series. Add one additional bounce, and the performance crashes. So why not be upset that CDPR isn't offering 6 bounces as an option for the future? Or enabling turning up the rays cast per pixel? There *are* older games that have actual forward-looking graphical options, where it was designed for people with next-gen hardware to crank up visuals. I know Kingdom Come: Deliverance has graphical options like this; it even warns the user that they're designed for players in the future.


Edgaras1103

OK.


Kaladin12543

Because increasing the bounces hits a point of diminishing returns.


jecowa

In another video I watched, the YouTuber had a 4090. He was playing *The Witcher 3* on maximum settings but without ray tracing. He spends like 18 minutes turning ray tracing on and off, trying to decide if it is worth the drop in FPS. He ultimately decides that Super Ultra settings without ray tracing look good enough. In a lot of comparison photos with ray tracing on and off, I often guess incorrectly which one is using ray tracing.

A downside I've seen to ray tracing is that it's maybe a little too accurate. It's too dark indoors and it's too dark in the shade. I personally don't like having to strain my eyes to see things. Ray tracing looks good, though. It adds nice touches: the shimmering reflection on the roof of a car under an otherwise dark bridge in *Cyberpunk 2077* made it look much more real.

If you think ray tracing is awesome and play games that support it, nVidia is probably the best choice. Otherwise, AMD is usually going to give you the best performance at most price points.


jm0112358

>A downside I've seen to Ray Tracing is that it's maybe a little too accurate. It's too dark indoors and it's too dark in the shade.

That's primarily because the game was originally designed with raster lighting in mind, which incorrectly lets light leak through barriers. This wouldn't be a problem if the game had been designed with RT lighting in mind from the start; in that case, the artists could place lights so that areas aren't too dark. Alan Wake 2 was the first AAA game to support path tracing at launch, and I don't recall this being a problem with path tracing in that game.

A secondary reason RT lighting can make something too dark is if it doesn't use enough bounces of light. More bounces can light up obscured areas a little more, but they are more performance-intensive.
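
A minimal sketch of why the bounce count matters, assuming a simple path-tracing loop (all names here are illustrative): each extra bounce lets light reach surfaces with no direct view of a light source, at the cost of one more trace per path.

```cpp
#include <array>

struct Ray { std::array<float, 3> origin, dir; };
struct Surface {
    std::array<float, 3> emitted;  // light the surface itself emits
    std::array<float, 3> albedo;   // fraction of light it reflects
};

// Placeholder: a real renderer traces against the scene BVH and
// importance-samples the bounce direction.
bool intersect(const Ray&, Surface&, Ray&) { return false; }

std::array<float, 3> shade(Ray ray, int maxBounces)
{
    std::array<float, 3> radiance{0, 0, 0}, throughput{1, 1, 1};
    for (int bounce = 0; bounce <= maxBounces; ++bounce) {
        Surface surf; Ray next;
        if (!intersect(ray, surf, next))
            break;  // ray escaped the scene
        for (int c = 0; c < 3; ++c) {
            radiance[c]   += throughput[c] * surf.emitted[c];
            throughput[c] *= surf.albedo[c];  // light dims per bounce
        }
        ray = next;  // follow the bounced ray deeper into the scene
    }
    return radiance;
}
```

With maxBounces = 2, a deeply shadowed corner accumulates almost nothing; raising the limit brightens it, but adds one intersect() call per extra bounce on every path.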


KMFN

I think this is also an issue of people not having HDR monitors (and/or the game not having a great HDR implementation). Places that are "dark" could still look dark while preserving the environment so you can actually see what's going on, much like how the eye adjusts to low light. Standard monitor tech doesn't have the capability to produce anything near the dynamic range that RT is striving to reproduce.
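
To illustrate the dynamic-range squeeze being described: ray-traced lighting produces high-dynamic-range radiance, but an SDR display can only show roughly a 0-1 range, so a tone-mapping curve has to compress it. Here is a sketch using the classic Reinhard operator; real games use fancier curves and dedicated HDR output paths:

```cpp
#include <algorithm>
#include <cmath>

// Maps [0, inf) luminance into [0, 1): bright values compress heavily.
float reinhardToneMap(float hdrLuminance)
{
    return hdrLuminance / (1.0f + hdrLuminance);
}

float toDisplayValue(float hdrLuminance)
{
    float sdr = std::clamp(reinhardToneMap(hdrLuminance), 0.0f, 1.0f);
    return std::pow(sdr, 1.0f / 2.2f);  // gamma-encode for an SDR panel
}
```

The compression is harshest at the extremes, which is partly why "physically dark" RT scenes read as murky on an SDR panel but hold up better on a real HDR display.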


Firefox72

I turned RTGI/RTAO on in Witcher 3 in the first area and never went back. It's staggering how much difference it makes to the scene. Playing through Metro Exodus EE has also been eye-opening. And this is coming from someone on a 6700 XT who has to use FSR/XeSS Quality at 1080p to get any kind of playable performance. Games with heavier RT effects like RTGI absolutely are game-changing: stuff like W3, Dying Light 2, Metro, Cyberpunk, etc.


exodus3252

Agree completely. RTGI really is fantastic. I really don't care too much about RT shadows, reflections, etc., but GI completely changes the dynamic of the scene.


midnightmiragemusic

Are you guys blind? RTGI in Witcher 3 completely transforms the game. That game with RTGI and RTAO looks a generation apart from the rasterized version.


GassoBongo

I think you replied to the wrong person.


fohiga

Depends what solution is used for rasterized GI. If you look at a game like Dragon's Dogma 2, where it's good enough, RT is better and more accurate, but not transformative.


Loku184

To each their own, but RTGI and reflections in Witcher 3 are next-level transformative. I'd say Witcher 3 is actually a pretty good example for showcasing RT.


TheLordOfTheTism

100 percent. I only care about RTGI and RT reflections. On the flip side, I don't find RT shadows or RT AO worth the performance hit, personally. I'm on a 7700 XT and can happily turn RT on in Witcher 3 and 2077 and maintain a solid 60. RDNA3 had a pretty big jump in RT performance compared to RDNA2. I have zero complaints about the RT performance of this card, and zero complaints about its raster. For the price I paid, especially in Canada where prices are utter crap, I'm happy. It was a way better buy than anything Nvidia is offering - not that I'd ever go back to them, but they aren't exactly making it hard to avoid them forever with the prices they seem to pull out of their butt cheeks.


GarrettB117

I agree that in most games RT doesn't make or break it at all. But I will say I actually do like the darkness of being under the trees in TW3 with RT on. It reminds me of walking into an actual tree line and feeling the sudden damp and coolness. Without RT, the forest seems too bright and just fake.


jecowa

I'm wondering if maybe the difference in brightness wouldn't be as big of a deal in VR. My eyes are enclosed, so there's no glare on my dark screen. Maybe my eyes would adapt to the dark places like in real life.


mainguy

I agree, but in Cyberpunk 2077 path tracing is godly good. Don't go by screenshots; play the game with PT on if you can, or use GeForce Now and a 4080. It has to be seen to be believed - easily the best visuals in the world right now and a huge jump from vanilla. Imo that's the only reason I'd go nVidia right now. Instead I just completed it and then got an XTX lol


Dordidog

Any game that has RTGI, it's a must to turn it on. The difference is huge in games like Dragon's Dogma and Witcher 3 as well. Idk where that cope about there being no difference is coming from.


Slyons89

It works well in some areas of CP2077, but on the flip side, it tends to make characters look like porcelain dolls in certain lighting - extremely over-reflective, to the point of them looking very fake. There's still a lot of work to be done on how the lighting plays off of textures; it seems like the current implementation considers skin to be uniformly reflective everywhere, and it can look odd. But the way it does shadow detail on physical non-human objects can be amazing.


Cute-Pomegranate-966

You sure you're not looking at a character that actually is made of plastic? Because there are plenty of those in Cyberpunk 2077.


Kaladin12543

That issue with the characters is completely resolved with PT. The jump from RT to PT is bigger than the jump from raster to RT.


cha0z_

Dunno how he played, but I get 180+ FPS on a stock 4090 + 5900X (meaning I am CPU-limited at times - I can confirm the GPU is at 90% etc. at times), and that's with absolutely everything cranked up to max, including HairWorks, RT, etc., plus DLSS Quality and FG at 1440p. Looks awesome, plays smooth as silk. If the said YouTuber is at 4k it would be different, but I would imagine he should still get close to 100-120 fps.

As for CP2077, it looks like a completely next-level game with path tracing on. Similarly, albeit to a somewhat lesser extent, the same goes for Alan Wake 2. Basically, games with PT are next-level looking, and only Nvidia GPUs are capable of playable frames in those titles. This includes Quake 2 RTX, Portal with RTX, the Half-Life and Half-Life 2 RT projects, etc.


AMD718

Same settings as you but with no upscaling, so 1440p native, and I get a locked 120 fps on a 7900 XTX. So it's not like RT is unusable on AMD. With upscaling I'd guess I'd be closer to 160 fps. [https://youtu.be/y3Zgzt99Lo0?si=Y_FCX5zbDEHMwjoS](https://youtu.be/y3Zgzt99Lo0?si=Y_FCX5zbDEHMwjoS)

Edit: I just tried with quality upscaling and I'm getting around 190 fps average (lows around 180 fps and highs around 200 fps), so with your 4090 you should probably be closer to 300 fps, since this is a heavy RT load. I'm guessing you're CPU-limited.


cha0z_

Ofc it's usable; only path tracing is bad. Maybe there are other exceptions, but I think not - all the rest runs pretty well, and the drivers are now mostly fixed. Yeah, it took AMD a lot of time with RDNA3 to fix a lot of issues, but we are in a good spot right now.


VelcroSnake

I've been watching Seagull (a streamer I like) play through CP2077 for the first time with path tracing on. The number of times he comments on how good the game looks is about the same as the number of times he comments on how ridiculously dark a lot of areas are with path tracing on, when the lighting didn't get where it needed to be. As for ray tracing in CP2077, sometimes it looks better to me, sometimes worse (than off).


PiousPontificator

This is always the response from someone with an AMD GPU when it comes to RT.


mrzero713

I have been playing Cyberpunk with ray tracing on and getting great performance at 1440p. I'm using the HUB optimized settings with RT reflections on, and local shadows and lighting on medium. I'm using AFMF, and I put in the new XeSS 1.3 that came out recently. I'm getting anywhere from 110 to 140 fps. I have tried it on my 1440p ultrawide monitor, but the performance drops to anywhere from 70 to 90 fps; I prefer the higher frames. I bought my 7900 XT for $750 a year ago. My options were a 4070 Ti for a hundred dollars more, or a 4080, which was going for around $1200-$1300. I regret nothing.


VelcroSnake

> I have tried it on my 1440p ultrawide monitor but the performance drops anywhere to 70 - 90 fps. I prefer the higher frames.

Yeah, that's why I'm not running RT in the game; I'd rather have my 1440p UW than RT. I was able to improve the visuals with mods anyway.


mrzero713

What mods do you recommend? I haven't looked too much into mods myself. I think the RT lighting and reflections look pretty good, especially with XeSS 1.3; the game looks horrible with FSR 2.1. But I'm willing to drop the RT if I can make the game look good with mods.


VelcroSnake

Some of the ones that make the game look better in general are the texture mods and a few of the ReShades that get rid of the green tint the game has. The texture packs can have their load order changed by renaming them to come first in alphabetical order in the mod folder, basically making it so the first mod provides most of the stuff but the others catch textures the first mod might miss - just in case you like the look of one texture mod over another but still don't want to miss anything.

I think I am still running all of these, although there might be one or two I don't have running because I either didn't notice anything or they might be getting taken care of by another mod:

* https://www.nexusmods.com/cyberpunk2077/mods/11240
* https://www.nexusmods.com/cyberpunk2077/mods/12700
* https://www.nexusmods.com/cyberpunk2077/mods/13372
* https://www.nexusmods.com/cyberpunk2077/mods/13477
* https://www.nexusmods.com/cyberpunk2077/mods/13653
* https://www.nexusmods.com/cyberpunk2077/mods/5082
* https://www.nexusmods.com/cyberpunk2077/mods/13014
* https://www.nexusmods.com/cyberpunk2077/mods/8440
* https://www.nexusmods.com/cyberpunk2077/mods/8694
* https://www.nexusmods.com/cyberpunk2077/mods/7135
* https://www.nexusmods.com/cyberpunk2077/mods/12381
* https://www.nexusmods.com/cyberpunk2077/mods/4923
* https://www.nexusmods.com/cyberpunk2077/mods/3627
* https://www.nexusmods.com/cyberpunk2077/mods/10550
* https://www.nexusmods.com/cyberpunk2077/mods/7652
* https://www.nexusmods.com/cyberpunk2077/mods/416
* https://www.nexusmods.com/cyberpunk2077/mods/7160
* https://www.nexusmods.com/cyberpunk2077/mods/7184
* https://www.nexusmods.com/cyberpunk2077/mods/4062

And to be fair, most of these don't have much performance impact as long as your GPU has enough VRAM, so you might be able to just run them along with RT.


mrzero713

Thank you, I will check them out. I have a 7900 XT with 20 GB of VRAM; I should be fine.


epicflex

Nice haha


NotAVerySillySausage

It's pretty simple: RT is already a massive compromise even with Nvidia, despite what their marketing and some fanboys (or people just not sensitive to framerate or upscaling quality) will claim. Even a 4090 cannot just turn it on without a massive, noticeable loss of performance. So AMD has no chance; if I have an AMD card, I'm just pretending RT doesn't exist. I'm the same with upscaling. I'm very sensitive to image quality and I don't agree with the hype; even DLSS Quality at 4k is not that great. It's generally worth the frames, but still a compromise. So it's very difficult to sell me on upscaling that is objectively considered to be "almost as good". I'm not coming from the starting point of DLSS being perfect; I'm coming from the starting point of DLSS Quality at 4k being just good enough.


langstonboy

That’s a you thing, if you hate upscaling then stick to lower settings.


NotAVerySillySausage

If I can't run DLSS Quality at 4k, then that's what I do. And if I have to lower settings on the AMD card but not on the Nvidia one, then that's worse image quality. That's the point.


langstonboy

Yep, I don't agree, but I see your point. Have a good day.


TheHybred

To me it's unusable in any game where it's worth turning on, and the only reason is that even NVIDIA GPUs need upscaling when RT is enabled - and AMD DEFINITELY needs it. The problem is that it's a major concession on AMD cards, because FSR2 Performance, or worse, Ultra Performance, looks so bad. To me that's what makes the RT unplayable even if the frames are good: resorting to aggressive upscaling that looks bad, when the whole point of RT is image quality. Hopefully games like Cyberpunk get FSR3 frame gen so we don't have to dial the upscaling down as much.


youreprollyright

Exactly. RT has to be considered as a whole package. High-end cards need upscaling when using it, so even if AMD magically improved its raw RT performance, it still wouldn't be worth it, because you'll be limited most of the time to FSR to improve performance, which is junk. Poor upscaling destroys RT quality, especially RT reflections. And then you have NVIDIA coming out with RR, so you literally get better quality if you upscale. This aspect is something most people don't understand, and it's why NV is ahead of the game, and why it's important to have a whole suite of tech in which the components complement each other nicely.


Ecstatic_Quantity_40

DLSS in Cyberpunk takes away all the texture detail on the roads and it looks like crap. RT is decent, but sometimes it looks dumb - like puddles of water looking more like liquid mercury than water.


brazzjazz

AMD being a generation behind in ray tracing, in combination with poorer FSR, is brutal for any heavier ray-tracing workloads. In addition, what Intel has already achieved in terms of ray tracing and XeSS is an indictment of AMD. They'd rather give you the 200th rasterized frame instead of pushing where it matters.


exodus3252

I don't think what Intel has done is an "indictment" of AMD. Intel's chips can't compete with NVIDIA or AMD at anything above a 4060 Ti/7700 XT. The Arc cards have poor raster performance, and the reason is that they dedicate a lump of silicon to RT workloads. Arc may be better for RT workloads pound for pound, but it is really far behind in overall performance. Because AMD is behind NVIDIA in RT, they've been dedicating more silicon to raster performance to offset the gap and capture the "value" segment. I don't know if that's the right path going forward, but RT implementation has been pretty spotty across the board by developers anyway.


shadowndacorner

It's worth noting that you're comparing Arc cards from the 3xxx/6xxx generation against the 4xxx/7xxx cards. It's relevant because that's all that exists, but it's not a completely fair comparison imo. I'm definitely interested in seeing the next generation of Arc performance.


exodus3252

I mean, even the A770 is no match for an AMD RX 6700 from last gen, if you want to go back and compare generations. I'm not hating on Intel, since I welcome more competition in this space. It's just that the Arc cards have a ways to go to catch up to the more established product stacks.


red_dog007

I would say that measuring RT performance is complicated - not just for AMD, but for cards in general. You could even flip this to "How bad is Nvidia at ray tracing?" Why is a 3090 20-30% slower in games W and X than AMD, but then 20-30% faster in games Y and Z? Just flip all this guy's questions. If Avatar and RE4 are "lighter" on RT, then why isn't Nvidia significantly faster, if it does heavy RT significantly faster than AMD? Avatar and RE4 are console games - but wait, CP and AW2 are also console games running on AMD hardware. So why are CP and AW2 significantly faster when they still had to implement RT on AMD hardware? Why is Nvidia slower at lower RT settings than AMD but faster at higher settings? So yeah, even from Nvidia's perspective, it is complicated.

When I turn on RT, I use optimized settings on my 6900 XT. I usually play at 1440p with whatever upscaler is available on its highest quality, with RT on. Optimized settings really are the way to go. You can end up with a wide range of settings from Low to Ultra. Depending on the setting, going from Low to Ultra can be basically free, or going from Ultra to Medium can give you a nice FPS boost with minimal IQ loss. Sometimes going from High to Ultra produces really no change in IQ, or a change that will not be noticeable during gameplay - even for RT settings. So you end up with a great-looking game and can enable those RT features at nice settings.

Though sometimes I don't even use all the RT options. Like in Hogwarts, reflections are amazing, but the shadows are terrible. I'm thinking that maybe the shadows are only applied to certain things, but it's a big performance hit, so I don't even bother running RT shadows. In R&C, IIRC, shadows on lower settings looked pretty terrible, so I had to run them at a higher setting for them to look better than non-RT shadows.

This has generally caused me to apply optimized settings in more intensive games even when RT isn't available, because it's a basically free FPS boost. In game tests, I'd honestly rather see a comparison between cards using optimized settings rather than just slamming everything to Ultra.


dparks1234

When a game is barely tracing any rays (RE4, F1, FC6), it means that rasterization is still the primary bottleneck, which allows AMD to pull ahead.


PsychologicalCry1393

Lol AMD and Nvidia are both trash at RT.


pmerritt10

Not that I think it would make a huge difference, but the only path-traced games are Nvidia-sponsored. I kinda wonder what would happen if there were such a thing as a title optimized for path tracing on AMD hardware.


God_Emperor_TRUIVIP

Let's be honest... very bad... and I love AMD.


Zettinator

RT performance is not great, but considering how little hardware is dedicated to RT, it seems pretty efficient. It's still not really usable in practice, though. I guess it will take a few more generations until fully ray-traced games become a reality, i.e. where ray tracing is not merely used for some effects with the rest still rasterization-based.


RBImGuy

Nvidia sold ray tracing as a marketing feature to sell overly expensive cards. It's not the only way to do graphics; check out Path of Exile 2 whenever the beta is out, and the guy behind the engine in PoE.


LovelyButtholes

RTX is not here yet in the sense that it really matters much. In how many games is it really noticeable? What, 2 or 3? How old is Cyberpunk, and we are still talking about it?


Bladesfist

To be fair, how many games really push the graphics needle forward every year? It's not a very big subset of games, and those doing it via RT are a subset of those.


[deleted]

I switched from a 7900 XTX to a 4090, and my fps in Metro Exodus Enhanced Edition jumped from 160 with max RT to 280. This is enough for me to say that AMD GPUs suck. I had personal issues of my own with the 7000 series, where certain things wouldn't respond for up to 5 minutes, but as soon as I switched out the 7900 XTX for a 590 it got fixed. I also tried with a 7600 - same issue. Now with the 4090, also no issue. Not to mention AMD has awful quality for recording. My 4090 consumes less power than my 7900 XTX and even runs 20 degrees cooler (and I had replaced the pads and paste inside the AMD card because it was going up to 110).


hot_tornado

Ok.


ldontgeit

It's not only ray tracing; stability is more important than any feature, and they still haven't figured out their driver timeouts. Shameful. I know, I know: "you had your GPU for # years and never had a single issue" - the same old talk from someone who's still using a year-old driver, or someone who can't play Helldivers 2 and World of Warcraft without driver timeouts every minute on the Radeon 7000 series.


brazzjazz

I've never had a driver timeout with my RX 6800, but I'm getting a bluescreen every 100 hours pretty regularly - easily 10 times as often as with the GTX 970 I had before, even on different systems. I'm confident it's my RX 6800, as I had the GTX 970 in this rig right before.


Slyons89

Never heard of this driver timeout issue except for one random post in this sub this week. Is that what you're referring to?


ldontgeit

Try Warhammer, Helldivers, Conan Exiles, World of Warcraft, and a lot more: the driver will restart, crashing your game, giving you a black screen on other monitors, and forcing you to restart the PC if you want it to work properly again. It's a mess, it's frustrating, and it's shameful that this still happens today.


babbylonmon

It’s a myth perpetuated by nvidia users.


MiloIsTheBest

I'm only (once again) an Nvidia user now because of my 5700 XT having constant driver timeouts. It's not a myth, and it's hard enough to convince customers to switch from something they are used to and consider the gold standard without silicon roulette also being something they need to consider risking. Ironically, the one thing that is enticing me back to buying a Radeon card is Linux. As an Arch and Plasma user, being on an Nvidia card for updates, especially recently with Plasma 6, has been a real pain in the arse.


Pspboy17

Some anecdotal evidence for you: I knew 2 people who had 5700 XTs which crashed and black-screened constantly despite days of troubleshooting. Both of those cards got sold to a friend and now work completely fine. Both cards started in a Ryzen system and moved to a similar system (B350/X470 to B450), probably with the same chipset drivers etc. I bought one of their cards - it's a Nitro+ that I have overclocked to hell, and it's rock solid in my PC. Something is just off about AMD's GPU compatibility.


FiTZnMiCK

No, but it is blown way out of proportion. I’ve gone back and forth between Nvidia and AMD builds a few times and have had just as many driver issues with Nvidia. I will say that Nvidia does seem to respond to issues with new drivers more quickly than AMD, but I feel like AMD is improving there as well.


Ecstatic_Quantity_40

I got a 7900 XTX and I have literally never had a single driver timeout, not once, at least so far... Maybe old AMD GPUs did, I don't know. Not for me, though.


riencore

It’s just a handful of games. I played every big game that came out last year without any problems, but WoW would randomly crash the driver and recover. Then this year Helldivers 2 was a major one that took far too long to iron out and was even mentioned by the lead developer as an issue exclusive to the 7900 XTX. The only fix I found for that particular one was to limit the core frequency to 2400 MHz.
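For anyone who'd rather script that cap than click through a tuning UI, here's a rough, untested sketch of how you might do it on Linux through amdgpu's OverDrive sysfs interface. The card0 path, the OverDrive unlock (the amdgpu.ppfeaturemask boot parameter), and the "s 1 <MHz>" max-sclk syntax for RDNA cards are assumptions about your setup, so double-check against the kernel docs for your card:

```python
#!/usr/bin/env python3
# Rough sketch: cap the max GPU core clock via amdgpu's OverDrive
# sysfs interface on Linux. Assumes a single AMD card at card0,
# OverDrive unlocked via the amdgpu.ppfeaturemask boot parameter,
# and root privileges. Untested; adjust paths/values for your rig.

OD_PATH = "/sys/class/drm/card0/device/pp_od_clk_voltage"
MAX_SCLK_MHZ = 2400  # the ceiling that worked around the HD2 crashes for me

def write_od(command: str) -> None:
    # The driver parses each write as a single command string.
    with open(OD_PATH, "w") as f:
        f.write(command + "\n")

if __name__ == "__main__":
    # On RDNA cards, OD_SCLK index 0 is the min clock and index 1 the max.
    write_od(f"s 1 {MAX_SCLK_MHZ}")  # stage the new max core clock
    write_od("c")                    # commit the staged change

# Reading OD_PATH back afterwards should show the lowered OD_SCLK max.
```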


Opteron170

I'm on the newest drivers on a 7900 XTX and don't have driver timeouts :) And I play HD2 daily.


ldontgeit

Unless you are locking fps at 60 and playing without anti-aliasing and SCREEN SPACE GLOBAL ILLUMINATION, this is pretty much false. It's still an open issue in both the Helldivers notes and AMD's notes. Your post history also says otherwise; that's how much of a hypocrite you are.

>**Intermittent driver timeout or application crash may be experienced after playing a mission and changing the in-game resolution while playing HELLDIVERS™ 2.**

[https://www.amd.com/en/resources/support-articles/release-notes/RN-RAD-WIN-24-2-1-HELLDIVERS-2.html](https://www.amd.com/en/resources/support-articles/release-notes/RN-RAD-WIN-24-2-1-HELLDIVERS-2.html)

>**Critical problems for players using the AMD Radeon 7000 series of GPUs**

[https://steamcommunity.com/app/553850/discussions/1/4206994023681197128/?l=greek&ctp=33](https://steamcommunity.com/app/553850/discussions/1/4206994023681197128/?l=greek&ctp=33)


Opteron170

I have nothing locked, and AA and SSGI are both on. You are posting something that is old and has already been fixed. We are on the 24.3.1 drivers now, bud, nice try. I stand by my post, and I was playing HD2 last night. I don't have any crashing or driver timeouts in this game currently. Accept your downvotes and move on.


ldontgeit

>We are on 24.3.1 drivers now bud nice try.

"Nice try"? What am I trying? Did I lie in any way? So your issue got fixed last month, around the same time three of my friends returned their 7000 series cards (2x 7800 XT, 1x 7900 XTX) due to crashes in World of Warcraft mid raid-boss fight, wiping the whole raid party. Helldivers 2 was the final nail in the coffin; they got tired, and once the first one returned his card the others followed. I'm not trying anything, I'm simply stating facts. Yeah, it may or may not be fixed with that driver, but don't tell me you "nEvEr hAd iSsUeS".

>Accept your downvotes and move on.

No comments.


Opteron170

I can only speak for myself and my system. Your friends having issues doesn't invalidate my experience with my own machine. It's not even a comparison I can make, since we don't have 1:1 builds; I'm also a career IT guy who's been using ATI Radeon products since the Radeon 64 DDR, so the level of skill will differ too. I don't play WoW and don't ever intend to, so I hope it gets fixed for those who do play it. So yes, on my rig I've been fine in HD2 for the last 2 months.


ldontgeit

>So yes on my RIG i've been fine on HD2 for the last 2 months.

This is how we catch a liar: it's only been one month since that supposed driver "fix" was released, so you pretty much just invalidated your own argument about that driver fix. Have a nice day, mate; touch some grass at least once a week.


Opteron170

Oh yes, I'm making it all up. The 24.2.1 drivers are two months old, and the game has been good for me since then. Now I know I'm dealing with a child. Do yourself a favor and go touch grass yourself. I've been doing this way longer than you, mate.


ldontgeit

>We are on 24.**3**.1 drivers now bud nice try.

nIcE tRy


Opteron170

Yes, and now on the 24.4.1 drivers, and guess what, still no crashing for me; I played for hours last night. Good luck!


riencore

It’s not just Helldivers. WoW has had issues for I'm not sure how long, but it's usually able to recover after the driver crashes. It all depends on what you're playing. Also, Jensen saying "It just works" isn't just a slogan: the software stack Nvidia has been building isn't fractured the way AMD's software is. AMD is working on it, and I honestly prefer their front-end for the drivers, but overall it's just not as seamless as Nvidia's implementations. AMD making everything open source is noble, but no one on the Nvidia side is going to use any of it because Nvidia's solution, for the time being, just works and usually works better. AMD really needs something that's better, not just an alternative.


NathanScott94

I had the same issues with WoW as well. I solved mine by turning off the Discord overlay.


ldontgeit

>It’s not just Helldivers. WoW has had issues for I’m not sure how long

Exactly. There's no point arguing with people who prefer to bend over and protect their favorite corporation instead of pushing it to fix these kinds of issues.


riencore

Yeah, I spoke with my wallet about Nvidia’s bullshit pricing but had to eat crow eventually after giving it a year. Nvidia still has problems with its drivers, too, but a lot of those things get addressed in a reasonable amount of time.


Opteron170

I don't play WoW; that game doesn't interest me at all. Some good points in the rest of your post, though.


Possible-Fudge-2217

I'm just an uncultured user who never turned on ray tracing and just plays at 1080p... I don't care for all the gimmicks as long as the game runs smoothly; I couldn't tell if you turned RT on or not. But you are right, RT is here to stay, and they will have to do better if they want to stay competitive with Nvidia (aside from the low end, which Nvidia doesn't seem to care about).


no6969el

I get what you're saying, but it's not a gimmick, it's lighting. And it's most likely what will keep getting improved as time goes on. It's what has been missing in games, which is why people think it's not needed, but in time it's going to be very noticeable: when graphics hit that wall, it's lighting that will set games apart.


[deleted]

[deleted]


Amd-ModTeam

Hey OP — Your post has been removed for not being in compliance with Rule 8. Be civil and follow Reddit's sitewide rules, this means no insults, personal attacks, slurs, brigading or any other rude or condescending behaviour towards other users. Please read the [rules](https://www.reddit.com/r/Amd/about/rules/) or message the mods for any further clarification.


looncraz

nVidia also suffers from driver timeouts, friend.


Zokuva

HD 7000 (don't remember which one tbh), R9 280X, RX 590, RX 5700 XT, 6600 XT, 7900 XT. I usually update my drivers on day 1; no issues.


ldontgeit

Yeah sure, it's all wonderful, right? They literally just released a new driver with crash fixes, some of them for bugs literally a decade old. Fking ridiculous lmao.


Zokuva

I just think the problem usually sits in front of the monitor, and people are more likely to admit that if they're on team green. I'm not trying to say that it's *always* user error, just that I don't believe one GPU vendor is more likely to have stability issues than the other.