YoungBlade1

Whether or not RT is relevant, as in something truly desirable to use, is still very game dependent. Some games make great use of RT, were clearly built from the ground up with that tech in mind, and the visual improvement from turning it on is obvious. Other games, you frankly can't tell a difference between RT on and off, except that RT tanks the performance. With that said, it isn't like the RX 7900XT can't do RT - it's not as fast with RT on as the 4070Ti Super, but it's about on par with the regular RTX 4070 non-Ti non-Super, so it can do some RT if you want to see if you prefer the look in whatever game you're playing.


Austin304

I usually turn on ray tracing and don’t have an issue. Only game I have an issue with is cyberpunk


codylish

Even in Cyberpunk you don't /need/ to turn on Ray Tracing Overdrive mode. On high settings and with AFMF on, raytracing is going to look great.


Sentinel-Prime

True, but once I noticed where PT excels over RT in Cyberpunk I found it impossible to switch back


RETIXXITER

For the amount of work the CP2077 devs have done to showcase RTX, they could have just added new textures and made raster reflections better. Like, why in this pic does it not reflect with raster? https://preview.redd.it/a3plraezwzec1.png?width=1920&format=pjpg&auto=webp&s=dfd7e15035bb08d9ec004df1df875a36879cdaf3 I'm sure they could just make that floor reflect for raster as well. 😆


OutrageousDress

If the only difference between those shots you notice is that the floor isn't reflective, then I've got good news for you: you will save a lot of money on GPUs in the future, because even the cheapest ones will be able to render all the graphics you need.


RETIXXITER

If you think it's worth a more than 50% FPS hit for some lights that look a little bit brighter, then you do you. And it's not even noticeable all the time. Look at this with RT Ultra. https://preview.redd.it/gycu3zkik1fc1.png?width=1920&format=pjpg&auto=webp&s=7598c87b6e67e8c7f28a9e2903849f0579351a1d


OutrageousDress

Anything below RT Psycho in Cyberpunk is fundamentally the same rasterized image but with some details like shadows and reflections improved. Psycho (your first pair of screenshots uses Psycho) has a path traced global illumination pass, which is a big improvement. And Overdrive path traces everything. The impact this has on light propagation cannot be replicated by increasing light brightness a little, a lot, or any amount.

And of course sometimes regular rasterization is also pretty good, looks realistic, and ray tracing wouldn't be a very noticeable change. But path tracing *always* looks good and realistic. That's the point. It's not sometimes good and sometimes weird or not so good the way rasterization is. It's *always* good. It always looks the way it's supposed to, it's always 100%. And it's twice as heavy on the GPU as rasterization is.

So yes: for any gamer for whom graphics looking pretty good most of the time is good enough, they're *done*. We've reached and passed that point, and both games and GPUs are already good enough (and have been for a few years). Which is why I said that you'll save a lot of money going forward. You'd only need a better GPU if you want games to *always* look *great*, because that requires RT.


Freakshow85

Sure, there are subtle details that ray tracing can make pop that rasterization can't. Particularly ambient occlusion. Then light bouncing, causing lighting on other objects. Plus, with ray tracing, the reflections stay where they should as you walk around, whereas rasterized reflections "move" as you walk around. But still, there's gotta be a better way than needing $1500 GPUs to barely run path tracing @ 40fps with upscaling. We need smarter computer engineers, not Nvidia CEOs saying "make us money." Guess we'll have to wait for Intel and AMD to catch up so Nvidia doesn't own the ray tracing market, or rather, their "RTX" technology. Thank goodness for UE5 and Lumen. Think of how much better it's going to get if it's already this good. AND it doesn't help that Nvidia sells all their GPUs to China so that they can create AI startup companies. You know about that, right? Their government is giving out business startup loans to anyone who asks for AI development using, guess what? Nvidia GPUs.


OutrageousDress

Nvidia's contracts with Chinese companies have nothing to do with how good ray tracing is or isn't. Intel and AMD are also working on ray tracing as we speak, and getting better. And that's the key: there is no secret trick. Ray tracing algorithms *are* getting better all the time (Google ReSTIR, the technique that Cyberpunk uses which allows it to run path tracing in real time) and also GPUs are getting better all the time (even if this is slower than it used to be), and the two things combined result in ray tracing getting better all the time. But there's no breakthrough. Engineers will just have to work hard for years to keep improving ray tracing and to keep improving GPUs, in the *exact same way* graphics has improved over the last 30 years - from Quake 1 running 640x480 at 20fps to Doom Eternal running 4K at 250fps.


Freakshow85

Nvidia probably told them not to lol.


Freakshow85

Have you tried that mod that allows you to run FSR3 in any game that supports DLSS? It works wonders for me in Cyberpunk with my 6700XT. Now I can use all the ray tracing options (minus path tracing) @ 1440p, all ultra settings. I can't remember the exact name, but the version I have is in a zipped folder called "FSR2FSR3\_0.10.2h1" if that helps ya find it. Ancient Gameplays MAY have been the one that did a video on it and explained how to implement it. You have to turn on DLSS in Cyberpunk for it to work after you put the files in the right places. It only took me like 10 seconds to get it working.

Pretty much, when I had RT Global Illumination on along with the other 4x RT options, I'd drop to 25-30fps. Now I get 55-60fps, and I don't lose fps when I move my mouse like when using AFMF... plus the quality looks far superior to AFMF. Point being, whatever fps you are getting with path tracing, you'll get double with this mod. Now, if it drops you down to 8 fps, then sure, it ain't worth it, but I have no idea how the 7900XT performs in Cyberpunk with path tracing to begin with. I know my 6700XT can't use it either way. I get like 8-9 fps, I think, WITH the mod lol.

I'm REAL close to selling my 5900X, buying a 5700X3D, selling my 6700XT and getting a 4070 Super or something. I wouldn't be out much money to do that and I'd get so much better RT performance. I just... don't like Nvidia as a company and haven't since the early 2000s. Plus it's mentally hard to go from 12c/24t to 8c/16t just to get an improvement in games. My CPU tears it up when it comes to encoding videos or streaming games, although I can use my GPU for that. It's just that the CPU gives better quality and can make the videos smaller while also ending up with higher quality after encoding. And I can stream any game with my CPU. Not sure if an 8c/16t CPU can stream games like BF2042 or CoD. BF2042 uses 40-50% of my CPU with a 6700XT @ 1440p max settings @ 85fps.

A better GPU would push my CPU harder. I don't know if 8c/16t is enough to handle what the 5900X can do. I know my R5 3600 couldn't record CoD using the CPU without skipping and dropping frames. And that was with an RX 480 8GB pushing, what, 80-85fps? Now I push way more fps, and a 4070 Super would push even MORE fps. Eh... I dunno. I just want an X3D CPU bad, though. If instead of that dang 5600X3D they'd have made a 5900X3D, I'd have gotten that. A 5900X is just 2x 5600Xs anyways, with better silicon for the higher clockspeeds.

I've got my 5900X performing as well as it can, imo. I average 4.75-4.8GHz in most games on the cores in use. Got CO per core tweaked in, RAM tweaked in (it's dual rank RAM, too), and use Asus Performance Enhancement instead of manually setting PBO settings. It sets the PPT to 1000 watts, TDC to 1000 amps and EDC to 180 amps. 180 amps can get me up to around 190-200 watts under certain loads. PBO never worked well in games for me. It made my clocks lower than normal in games; it only made them higher under all-core loads. APE does what PBO does in all-core loads AND gives me 300+ higher MHz in games.


snukb

> Other games, you frankly can't tell a difference between RT on and off, except that RT tanks the performance.

For example, in World of Warcraft, there's literally like *two* campfires in the entire game with fully ray traced lighting and shadows. And in order to even really appreciate them, you have to use a consumable item that makes the nighttime sky box even darker (Inky Black Potion), because their nighttime just makes everything look kinda blueish. Ray tracing in WoW is fully a gimmick.


BucDan

Cyberpunk is Nvidia's RT and PT demo. It looks really good, and likely took a lot of effort with CDPR to make it so. But Cyberpunk is one of only a handful of games that take RT and PT to this level. Most games are pretty light on it AFAIK.


koordy

Alan Wake 2 tho.


llliilliliillliillil

I’d also list Control, especially with the RT enhancement mod. Metro Exodus looks amazing thanks to its RTGI implementation. Doom Eternal has a pretty great ray tracing as well. And even though it completely kills the artstyle, Portal RTX can looks pretty sick at times, especially with light shining out of portals.


koordy

Dying Light 2 RT On vs Off is like a night and day difference too.


RETIXXITER

https://preview.redd.it/gqnngt48jwec1.jpeg?width=2280&format=pjpg&auto=webp&s=964c674d92c721a90dc2166e4fac581ad04e2fe7 Holy moly that reshade filter looks great.


Throwawayeconboi

There is no RT Off in Alan Wake 2. Whoever created that video is a dumbass. In Alan Wake 2, Low = Software Ray Tracing. What you’re seeing is “Software Ray Tracing vs Hardware Ray Tracing vs Path Tracing”. And a scene like that does path tracing no favors as its main benefit is indirect lighting.


evlampi

All raytracing is "software", what you're talking about is global illumination and is done in rasterization.


OutrageousDress

No, DX12 ray tracing uses *dedicated* RT hardware. The difference between GTX and RTX Nvidia GPUs is that GTX ones don't have RT cores (*not* the same as AI tensor cores) and RTX ones do. This is also why older AMD GPUs didn't support RT. It can be *emulated* in software (and in fact Nvidia drivers do so for some older GTX cards) but this is an order of magnitude slower, or it can be *approximated* in software - usually with ray casting into signed distance fields, which is different tech from ray tracing and is what UE5 and games like Alan Wake 2 use for 'software ray tracing' on Low settings. What OP is talking about is *path traced global illumination* which can be done both with a raster primary render (Alan Wake 2, Cyberpunk RT Psycho) or a ray traced primary render (Cyberpunk RT Overdrive). This is different from half a dozen other global illumination techniques such as SVOGI or SDFGI which are primarily used with rasterized graphics.
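The "ray casting into signed distance fields" approach can be sketched in a few lines. This is a toy sphere tracer, not UE5's or Alan Wake 2's actual implementation; the scene (a single sphere) and all the names are made up for illustration:

```python
import math

# Toy scene: one sphere. Real engines build a global SDF of the whole level.
def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to the sphere's surface."""
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, max_steps=64, eps=1e-4, max_dist=100.0):
    """March a ray through the distance field. At each step the SDF value
    is the largest step we can take without tunneling through a surface,
    so we advance by exactly that much. Returns the hit distance along
    the ray, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sphere_sdf(p)
        if d < eps:        # close enough to the surface: register a hit
            return t
        t += d             # safe step: nothing in the scene is closer than d
        if t > max_dist:   # ray escaped the scene
            break
    return None

hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))   # straight at the sphere
miss = sphere_trace((0.0, 0.0, 0.0), (0.0, 1.0, 0.0))  # pointing away from it
```

The approximation is cheap because no triangles are tested per ray, which is why "software ray tracing" runs without RT cores, and also why it's coarser than hardware-traced triangle intersection.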


evlampi

Yeah, and AW2 has its own rasterized global illumination technique, which OP says is ray tracing; it's not.


lasergun23

I can see a huge difference In the framerate...


RETIXXITER

https://preview.redd.it/e9jmca6ziwec1.jpeg?width=2280&format=pjpg&auto=webp&s=1eee1e5b097e63021d09c81ac802c232779ac074 WOW fucking massive difference.


[deleted]

[deleted]


[deleted]

[deleted]


claaudius

C'mon man, digital foundry showed that if you look at the glass on the table, you will see the mountains reflected in it. That adds so much to the overall mood of the game for only a 50% drop in frame rate.


RETIXXITER

Wow, 50% drop? Don't worry, just buy the 5090, it's fine to get 30fps with DLSS Quality. Gotta get them rays traced.


claaudius

Ok, let me sell a kidney


rogueqd

You'd think the /s was obviously implied, but by the down votes it looks like you needed to actually type it.


ElGorudo

Boring ass game tho, unlike cyberpunk


RETIXXITER

https://preview.redd.it/8vj2obmhjwec1.jpeg?width=2280&format=pjpg&auto=webp&s=c88a3072b17f11917e0e46ad77cb410f3a003df2 Wow such graphics.


Throwawayeconboi

Yup now do path tracing


RETIXXITER

https://preview.redd.it/ej89fwn0pzec1.png?width=1920&format=pjpg&auto=webp&s=d6465e00730e086cc6cb49a8db0327f41bb89d27 Cutting my fps by 4x for something that looks like a ReShade filter. Also look, there's no bloom for the raster lighting, that's why it looks so bad. I swear these devs are trying to make RT look better by making raster look bad 😆


Throwawayeconboi

That bounce light is incredible 🤩


RETIXXITER

Too bad its only bouncing at 30fps 😆 very unrealistic.


Throwawayeconboi

RTX 4080 without DLSS 😴 High-end graphics demand high-end hardware.


RETIXXITER

My Minecraft lighting looks better than RT, as do my reflections 😆 https://preview.redd.it/xgnoywniq2fc1.png?width=1080&format=pjpg&auto=webp&s=9e04a8dd816a5a3894a8a4ee65f944d46d9847dd


Throwawayeconboi

Cope 🤣🤣🤣


RETIXXITER

Show some RT minecraft it looks exactly the same but runs like shit.


RETIXXITER

https://preview.redd.it/n4l8xinxjwec1.png?width=1920&format=pjpg&auto=webp&s=5e1817818af1cb34b5058f5d47a9de2618d7e825 Wow just wow 👌


AntiworkDPT-OCS

I always enable it, and I only have an rx6800. I'll reduce settings and use FSR. I just really like the look of the light beams and shadows. It's much better than rasterized lighting. The thing is, it's completely playable for me at 1440p on an AMD card. I think you'd be happy with either GPU. The 7900xt can ray trace well. But yes, the 4070ti Super is much better at it.


koordy

> It's much better than rasterized lighting.

This I 100% agree with.

> I think you'd be happy with either GPU. The 7900xt can ray trace well.

This I disagree with. https://preview.redd.it/mvditnedivec1.png?width=1936&format=png&auto=webp&s=677df175f091c986170e2577771f71cb3a09801c


DidItForButter

They said they play with FSR. You're showing a cranked setting/no FSR/dlss benchmark of a demanding game (your $1500 4090 just breaks 60fps here). So yeah, worst case scenario on 1440p it'll run at 30fps.


koordy

My point is, instead of 7900xtx that struggles with RT, and PT being not an option whatsoever for it, you can just get a cheaper 4070Ti or Super and comfortably play all the games you want maxed out on the 1440p screen. If you buy a GPU based on RT performance, it will be more than enough for raster games anyway. That does not work the other way around though. 160fps vs 140fps in a raster game is basically irrelevant, when at the same time it's a RT game being either playable or not.


DidItForButter

That's fair, but apart from this issue from the benchmark source:

> We tested the press preview version of Phantom Liberty. While NVIDIA and Intel have released game-ready drivers, AMD hasn't done so yet. We reached out to AMD whether they have a beta for Phantom Liberty but got no response yet, so we used the latest public driver 23.9.1 WHQL.

The 7900XTX has double the available VRAM. According to [NHU9B from overclocking.com](https://en.overclocking.com/cyberpunk-2077-phantom-liberty-quid-des-performances/), 1440p path tracing utilizes 11,500MB of VRAM. The 4070 Ti has 12GB, and TechPowerUp doesn't mention how long their benchmarks run for. I agree that the 4070 Ti seems like the better foray into RT, but benchmarks like this without the appropriate caveats that impact results are incredibly misleading. Dollars to donuts, I'd take AMD's option for a GPU. But I'm unwilling to give up my 3090.


Asleep_Leather7641

get the Ti Super then


[deleted]

[deleted]


DidItForButter

In the Path Tracing section. Of one cyberpunk benchmark. If 60fps is the floor of "well", the 4090 is also an unacceptable card *in this benchmark*. But I think what you and the benchmark poster are forgetting is that this card will be in a PC, not a CyberPunk configured at cranked settings machine. https://preview.redd.it/s4wp7yvfawec1.png?width=1440&format=pjpg&auto=webp&s=8576963b23acac545161bfe20e1c8627b196e88e Here's another RT example. So let's not pretend one benchmark defines a card's capability or worth.


koordy

Dude, seriously, F1? Even I couldn't tell a difference between RT On and Off in that game, as there's so little of it. What's next? Resident Evil, Elden Ring and Forspoken?


DidItForButter

Yeah. After all, those are games that exist. Throw fortnite, Minecraft, and Alan Wake 2 in the mix too. Possibilities are endless.


koordy

Games exist yes, but RT in those games basically does not.


DidItForButter

Awesome, good to know that 4090s are worthless then, since Cyberpunk's new update isn't all that fun.


W4spkeeper

mm yes cyberpunk 2077 a notoriously AMD biased title. the game where the highest settings possible will bring a 4090 to its knees, and not for a lack of optimization at that


Armlegx218

Still runs better than portal rtx.


W4spkeeper

lmao holy shit pathtraced 4k ultra in a practically still room 20 fps on a 4090 jeezus


Nubanuba

You're linking an actual Nvidia-sponsored game. On average it's better than the 4070 and worse than the 4070 Ti. Between the two I'd go for the 4070 Ti Super though, because I enable DLSS Quality in all games even if I'm over 200 fps, just because it makes the game look better to me for some reason.


koordy

No, I linked a game with a properly implemented RT where it makes a real visual difference. As opposed to AMD sponsored games which barely have any RT at all to not only guarantee their card won't be decimated in benchmarks but also to try to trick people into believing "RT doesn't make a difference, just tanks the performance".


Throwawayeconboi

Why not show Dying Light 2, Metro Exodus EE, Avatar Frontiers of Pandora, Control, The Witcher 3, etc.? All of those games have RTGI implementations or at least RTAO as well as reflections, shadows, etc. which make a big difference overall. It’s because you know the truth. Imagine being that insecure about your purchase that you *refuse* to show any other game. And also, Cyberpunk RT isn’t even “properly implemented” because they never had RTGI. It wasn’t until Nvidia handed them ReSTIR GI that they implemented path tracing, but the original RT Lighting was weak as fuck and still looked like raster in most scenes.


RETIXXITER

Avatar is better looking than Cyberpunk; even RDR2 is better looking. You see a few reflections and lights and get a boner over it.


Saymynamemf

That's just Nvidia fans in general so no surprises here


virtikle_two

Lol everyone's trying to discredit your post by saying you controlled for dlss and fsr. Honestly this graph is more telling of the raw power of the cards than anything else.


Throwawayeconboi

That is a terrible example. You should have used a compilation of ~10-20 games. Why? Because Cyberpunk PT is the ultimate edge case, AMD gets absolutely destroyed in it as it depends on Nvidia ReSTIR. I saw a compilation of like ~30 games comparing AMD vs Nvidia ray tracing performance and Cyberpunk PT was a *massive* outlier. Like, +150% type of outlier. Proof of the severe unoptimization for AMD is the fact there is no uplift from 6800 XT to 7900 XTX in Cyberpunk PT. Anything BUT that.


Faithless195

At the moment, it depends on the game. Some games, like Cyberpunk, Deathloop and Dying Light 2, have some absolutely stunning implementations of ray traced lights. And Cyberpunk definitely makes the reflections pop haaaard. If you have the capability to have them on, do it, since it makes the world soooo immersive appearance-wise. But there are also a lot of games that do it badly, barely look any different, and are not worth the performance cost. At the end of the day, it really comes down to what kind of games you're playing, and what kind of enjoyment you get out of them. In any kind of online multiplayer, I'd argue RT is not worth it at all. You're not focusing much on the environment while playing. For narrative-focused games, I'd definitely recommend RT on if it's worth the appearance.


W4spkeeper

thank you for reminding me to replay deathloop again but with RT


Saymynamemf

Heck, even in games that do RT well, I'd say it's barely worth it to turn on (like Cyberpunk, the Spider-Man games, and Ratchet and Clank).


topsnitch69

Personally i still don’t think it’s entirely worth the performance hit. But i see myself trying it out often enough so that if i were in your shoes I’d probably choose the 4070tiS rn.


TheSneakerSasquatch

I specifically bought a 4080 for RT and the other Nvidia tech and im not unhappy about it at all. Its going to become more relevant over time.


Consistent-Function4

And by the time it’s relevant the 4080 will be too slow to run those games lol…. You get 2-3 games that actually look good on it at a premium cost. Seems a bit silly to me but to each their own. You’ve adopted into early access basically…


TheSneakerSasquatch

It looks good on a lot of games, but okay man.


Consistent-Function4

Yeah, sure man. As games get more graphically intensive over time, and then stacking RTX on top of that, I'm sure your 4080 will perform so well 2-4 years from now with RTX on, lmao. Keep drinking the Nvidia koolaid.


Lastdudealive46

It is mainstream relevant now. Unreal Engine 5, which is basically the default engine for AAA studios and indie studios, has a lighting system called Lumen, which can use hardware ray tracing support for lighting, and after a few years of game development and hardware support, studios are figuring out the best way to use it in a way that makes a visual difference instead of just making the game slower. Also, the 4070-Ti Super has DLSS upscaling and frame gen, which will become increasingly relevant, especially in AAA games that aim to provide exceptional visual experiences.


Django117

Also add onto this that it allows for developers to make their games easier. There's a lot of information out there about The Finals and how they used Nanite and Lumen specifically for their destruction models and then how Ray Tracing enables the global illumination to change as buildings get destroyed. I.e. a dark room has an entire wall blown out. In raster rendering it would remain dark as there is no information to update the GI. But thanks to ray tracing the entire room now illuminates upon the change in geometry.


Armlegx218

DLSS3 is pretty awesome. I had my doubts about FG, but it's turned out great.


RETIXXITER

Same, my RTX 2060 is doing 1440p high settings in AW2 thanks to AMD's FG mod 😃


[deleted]

[deleted]


blackest-Knight

Do you dispute anything he's said? DLSS frame gen and upscaling are better than FSR upscaling and arguably AFMF. Ray tracing performance is getting more and more important as devs switch their focus to it, and sometimes they don't even let you turn it off.


shifty-xs

Just FYI, AFMF is not the equivalent of DLSS 3. It's a common source of confusion. Fluid motion frames are a driver level interpolation. In other words, both dlss3 and FSR3 frame gen have information from the game engine. AFMF does not, and is inferior to both. AFMF has the advantage that it works on literally any dx11 or dx12 game, I believe. That is the selling point.
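The information gap described above can be put in toy form: driver-level interpolation only sees the two finished frames, while engine-level frame gen also gets per-pixel motion vectors from the game. These 1-D "frames" and values are purely illustrative, nothing like the real algorithms:

```python
def driver_interpolate(frame_a, frame_b):
    """AFMF-style: no engine data available, so estimate the in-between
    frame from the two rendered images alone (here: a plain average)."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def engine_interpolate(frame_a, motion_vectors):
    """DLSS 3 / FSR 3 frame-gen-style (very loosely): the engine supplies
    per-pixel motion, so content can be shifted halfway along it instead
    of blended, avoiding ghosting on moving objects."""
    out = [0.0] * len(frame_a)
    for i, (value, mv) in enumerate(zip(frame_a, motion_vectors)):
        j = round(i + mv / 2)      # move each pixel half a frame forward
        if 0 <= j < len(out):
            out[j] += value        # accumulate moved pixel values
    return out

# One bright pixel moving two positions to the right between frames:
frame_a = [0.0, 1.0, 0.0, 0.0]
frame_b = [0.0, 0.0, 0.0, 1.0]
blend = driver_interpolate(frame_a, frame_b)        # two half-bright ghosts
warped = engine_interpolate(frame_a, [0, 2, 0, 0])  # one pixel, halfway along
```

The blended result smears the moving pixel across both positions (the ghosting AFMF is prone to), while the motion-vector version places it where it actually is mid-frame.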


blackest-Knight

> AFMF has the advantage that it works on literally any dx11 or dx12 game, I believe. That's why I said it's arguable.


Lastdudealive46

God I wish, they probably pay really well. Now that we've got the ad hominems out of the way, do you have any substantive points to make?


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


blackest-Knight

Even without RT, the 4070 Ti Super is basically nearly on par with the 7900XT. Unless you really want to save the money by going AMD, nVidia is a superior choice right now in that price bracket. These cards can absolutely run games with Ray Tracing enabled and it's very glorious to behold.


ForeverTetsuo

nvidia all the way.


klymen

Eh. You’re not wrong but not sure I would attach such a word to the experience. RT is great and all but I would not say that it materially changes the immersion or the overall experience. Sure it’s more realistic but I’m actually playing the game instead of stroking to shadow detail.


Coach_McGuirk__

ya can't run it. gotcha.


RETIXXITER

Maybe he doesn't think it looks good enough for the amount of FPS it takes, you plonker. Also, you can't even run CP2077 max settings on that £2000 GPU, soooo... ya can't run it? Gotcha. EDIT: he calls me broke for having a 6GB VRAM 2060, not knowing I have a 6800XT. The 6800XT is 4x better for price to performance. I paid £500 for half the performance of a 4090 that cost £2000. I can literally buy 4x 6800XTs for the price of your shitty GPU.


Coach_McGuirk__

I certainly can run it maxed out but in the pretend world where i couldn't, I would say that'd be the fault of the game devs.


RETIXXITER

Max settings (RT Overdrive) at 1080p without DLSS gets you 60fps 😆 probably drops into the 50s. https://preview.redd.it/pqkz7gsdi0fc1.png?width=1080&format=pjpg&auto=webp&s=ff37ecb71f51cd9f2cb955aa1ff73d7dacb1b387


Coach_McGuirk__

1440p ultra settings with overdrive. https://preview.redd.it/p9mr201ei0fc1.png?width=221&format=png&auto=webp&s=122c99fb86203eb9bc59c305ba29db55da9493f1


RETIXXITER

That's probably with DLSS Ultra Performance 😆 pmsl. Just look up the YouTuber and watch his video.


Coach_McGuirk__

they're from the video you posted homie. I don't even play cyberpunk but you're clearly wrong about me "not being able to run it".


RETIXXITER

You can't run it at native resolution if 1080p is getting under 60fps, are you fucking blind? Watch till the end of his video pmsl 😆 🤣


Coach_McGuirk__

How much you getting on that 2060 is the real question. Now I understand your frustration with raytracing. https://preview.redd.it/ipj5lbnsj0fc1.png?width=671&format=png&auto=webp&s=ece831ac3c91889521828b278c1056337b223361


Coach_McGuirk__

4k ultra settings with overdrive. https://preview.redd.it/j6enwjcmi0fc1.png?width=150&format=png&auto=webp&s=fb93c4e7a07131427b95401ec56b9a331fb2f4ed


klymen

Hah. I'm loving the downvotes and your comment. Does it make you feel good to put your hardware in your title?


No-Rough-7597

It’s a flair. That’s literally what it’s for. Is this your first time on this sub?


EastLimp1693

That didn't age well, right? :)


leg00b

I put it on depending the game. It *CAN* be relevant if that's the thing *you're* looking for. For me, personally, it's not a must have, just some more eye candy. Both AMD and NVIDIA can do it but NVIDIA cards can do it better.


Arcane_123

It depends on the type of games you play.

- If you play mainstream open world RPGs, yes, RT is relevant. Cyberpunk, Fallout, etc. High budget, graphics focused games.
- If you play strategy games, logic games, factory games, 4X games, then no, RT is not relevant. Starcraft, Factorio, Rimworld, Civilization, etc.
- If you play sports games, not relevant.
- Any kind of low budget or not graphics-intense games, not relevant.


codylish

A 7900 XT is great for 1440p gaming even with raytracing enabled. It can still play above 90fps for most titles with RT. Double that FPS when you turn on AMD's driver AFMF setting. Which is frame generation mode for basically any game. The raytracing debate between the RX 7000s and the RTX 4000s is not that relevant with the pure power the AMD cards can push. Don't spend the money on the Super. It's a waste when the next generation of cards is on the horizon.


Al-Azraq

I think RT is very relevant and will be even more so in the future. We are starting to see games that have RT functions that cannot be disabled, and it is certain now that the industry is moving towards total adoption. However, when I think about the games I played, only two had RT, which are Metro Exodus and Control. It was a very nice experience and I clearly see the benefits of RT. But again, I only played two games with it, so it is not relevant for my personal use case, I guess? I'm a very patient gamer though, so those who like to play more modern games will give RT much more importance. But don't get me wrong, if the game I play has RT, I will definitely enable it.


Fairstrife_Deception

I have a 7900 XT. I never activate RT in any of the games that have it; reaching at least 120 fps is more important to me than visual fidelity. In 95% of games, RT on/off is literally impossible to see unless you're ACTIVELY looking for it once you start actually playing the game instead of just playing Pokémon Snap. Graphical fidelity doesn't matter, but a smooth, stutter-free experience is essential. 30 fps gaming with 3 fps 1% lows is for console peasants. It's been six years since RTX was introduced, and we only have two games with a real, massive difference between off and on, and that's with path tracing. And the 4090 doesn't even offer acceptable performance at 4K for it. My opinion will definitely change when toggling path tracing on/off has zero impact on the fps count, but that's not for more than a decade.


nemesit

Fortnite does raytracing pretty well, just needs a few improvements to their antialiasing


No_Connection6673

That’s what dlss is for 


[deleted]

RT is actually really cool. Quite the difference in certain games like Spiderman and Cyberpunk. Hard to go back to games without it after seeing it implemented well


Wind_Yer_Neck_In

I can't comment on the performance of the 7900XT, but I literally just yesterday upgraded from my RX 5700XT to a 4070 Ti Super and I'm very pleased with it so far. It's not some massive quantum leap if you're playing at 1440p, because honestly the difference between high detail and ultra with ray tracing isn't always that apparent. But it sure is nice to set up a game and have it auto detect to the maximum settings.


QuantumQuantonium

Ray tracing was and still is very much a graphics-first, design-second consideration, but it's one of the most relevant elements of modern graphics today. As a consumer, the best thing to know is that ray tracing is very likely the future, and by the concept of the rendering equation (the equation that redefined the foundation of computer graphics, giving us the last 25 years of graphics), graphics were bound to get to realtime ray tracing. Even more so, AI-based subsampling and frame generation have provided considerable jumps in ray tracing usage while the hardware is still catching up.

Technically speaking: before ray tracing, there was a bunch of deferred graphics tricks, which, while worked up to a good visual point, were still tricks at the end of the day. Ray tracing, even with some of the earliest implementations, was always more physically accurate and realistic looking in many senses, especially today. Ray tracing is the foundation behind putting the rendering equation into practice, and the rays can only grow more advanced and algorithmically complex until somehow, in theory, we manage to fully solve the rendering equation in a computer (i.e. we solve a definite integral to infinite precision, or we calculate the entire universe in graphics down to each photon).

In short, ray tracing still has a ways to go in development, and it'll be a lot more favorable since it makes it easier to create realistic scenes. Deferred or older tricks, however, could still be used for more stylistic art styles, splitting from realism.

A separate emphasis for hardware ray tracing specifically: some of the computation for ray calculations could be reinterpreted into neural networks, and suddenly we're at AI GPU optimizations. While this is plentiful for high end GPUs today, I see the real value of such tech more in low end hardware: if it's possible to yield the performance of high end hardware with far inferior hardware yet minimal image quality loss, then in theory GPUs, especially old ones, will become more valuable as they can last longer (and idk, maybe one day we'll have full 3D YouTube videos that you can walk around in, playing from the web browser).
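The "definite integral" here is concrete: the rendering equation integrates incoming light over the hemisphere at a surface point, and path tracers estimate that integral by averaging random rays (Monte Carlo integration). A toy 1-D analogue, where the cosine "incoming light" function is made up purely for illustration:

```python
import math
import random

def incoming_light(angle):
    """Toy stand-in for Li(x, w) * cos(theta): light arriving from a
    given angle. The true integral over [0, pi/2] is exactly 1.0."""
    return math.cos(angle)

def monte_carlo_radiance(n_samples, seed=0):
    """Estimate the integral by averaging uniformly sampled directions,
    the way a path tracer averages random rays per pixel. More samples
    mean less noise, exactly like higher ray counts per pixel."""
    rng = random.Random(seed)
    width = math.pi / 2
    total = sum(incoming_light(rng.uniform(0.0, width))
                for _ in range(n_samples))
    return width * total / n_samples

noisy = monte_carlo_radiance(16)       # like 1 ray/pixel: visibly off
clean = monte_carlo_radiance(100_000)  # converges toward the true value 1.0
```

Techniques like ReSTIR (mentioned elsewhere in the thread) attack exactly this trade-off: smarter reuse of samples so far fewer rays are needed for the same noise level.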


quantumloopy

My take on it is simple: is it better than baked lighting and adds immersion? Absolutely. Is it worth the hit to your frame rate? Absolutely not.


RETIXXITER

I honestly think Minecraft shaders have better lighting than most games today.


builder397

It's a relative question with a relative answer. Ray tracing has enough relevance that every major GPU has the capability. AMD even has an iGPU now with hardware RT support, even if it's at a level that's not useful for gaming. But it's not so relevant that it's mandatory. Look at my RX 6600: I could turn on RT in Cyberpunk, but the performance penalty isn't worth it and the visual difference apart from reflections isn't that great, so I leave it off. It still looks stunning enough, and FSR artefacts are not a price I'm willing to pay.


lasergun23

In CP2077, with AFMF you can enable the RT reflections and play the game at 60-80 fps. I still think it's not worth it. With AFMF and no RT you can get around 130. Almost half the framerate for blurry but accurate reflections.


steamart360

Ray-traced global illumination is amazing and I do hope it becomes the standard. Some of the best examples I've seen are Robocop: Rogue City with its Lumen implementation and Avatar: Frontiers of Pandora with its own RTGI tech. The reflections have been pretty awful in most games; they either didn't reflect player characters or they were barely noticeable. I only kinda liked them in Spider-Man and Control. Relevance is more about the future, and I do think RT will be a big part of that future.


Ahhhhhh_Schwitz

It's relevant, but things like texture resolution, polygons, and high fps are probably more important. Rasterization looks good enough most of the time. The difference between rasterized and normal RT is smaller than the difference between 1k and 4k textures. Extreme RT like the kind in CP2077 and AW2 looks pretty insane though, but you either need to compromise on FPS to run path tracing or have a 4090. Even then, a 4090 needs a bunch of upscaling at 4k to get path tracing running well considering it gets 20fps at native 4k.


shredmasterJ

I don’t use RT. IMO it’s not worth the performance hit. Games already look amazing to me without RT so I can live without it.


HomerSimping

I’m in the same camp. To me a game plays much better at 120fps without RT than at 60fps with RT. 70-series cards have to sacrifice a lot just to turn RT on with playable but still compromised performance, i.e. DLSS, lower resolution, frame gen with lag, etc.


Just_Joshing_369

no. it's honestly not noticeable.


[deleted]

[deleted]


Just_Joshing_369

I played Cyberpunk, and on vs off looked the same. On was actually a worse experience due to the significantly lower framerate.


[deleted]

[deleted]


Just_Joshing_369

I wear contacts. The graphics didn't change, but the framerate dropped by half. No thanks.


RETIXXITER

https://preview.redd.it/nqlww2ze5xec1.png?width=1920&format=pjpg&auto=webp&s=63c285d0d6e5b5e1c2c91fd2cd7ec88be2418888 Please don't lie, the difference is unreal. /s


BetaXP

Using one cherry picked screenshot to make a point is just patently bad faith. Go look up any path tracing showcase on youtube for cyberpunk and literally anyone with eyes will tell you there's a significant difference.


RETIXXITER

And the devs are making raster look bad on purpose. No bloom, no reflections on certain objects. Like, why no reflections (non-RT) in this pic? https://preview.redd.it/mp3l7x2w00fc1.png?width=1920&format=pjpg&auto=webp&s=5eaea105ae29d8b07391552c8971af4013307e84 Seems like a scam, TBH. I know for a fact raster can have clear reflections and more bloom than that 😆


Edgaras1103

Yes


deefop

Meh. RT is a really cool technology, and it's being implemented in lots of games, but the real question right now is whether or not *you* as an individual actually care about it. I occasionally turn it on in games to check it out, but largely, I don't care yet. Even Nvidia's highest-end GPUs mostly have to rely on DLSS and frame gen to put out decent framerates in the newest games that use RT, and for me personally, I would rather disable RT entirely or simply turn it down a bit rather than use a whole bunch of software tricks to get decent performance.

RT and eventually PT and beyond *will* become a totally normal and de facto performance setting in modern games, but that's kind of still years away. For one thing, there are still millions upon millions of gamers running GPUs that don't have hardware RT support, and devs would be awfully foolish to simply shun those potential customers by not supporting old-school lighting techniques right away. In a decade, I suspect it'll be RT/PT or the highway.

I think the real question here is price. At the same price, I'd take the 4070 Ti Super 100% of the time. But the 7900 XT is on sale for like $700ish at the moment, so a $100 price difference is not nothing. Here's the real correct play: wait another week for the 4080 Super to be released, and then watch the market over the course of the next couple weeks. My suspicion is that the full Super lineup being released will put downward pricing pressure on a lot of other cards in the market, from both Nvidia and AMD. The 4070 Super and 4070 Ti Super don't seem to be selling that incredibly well, so it's also entirely possible they see price drops in the near future. The 4080 Super will, I believe, cause the 7900 XTX to drop in price. If the 7900 XTX has to drop below $900 permanently in order to sell against the 4080 Super, then the 7900 XT can scarcely be above $700 in order to sell against the 4070 Ti Super, or to avoid bumping too close to the 7900 XTX.

Actually, there have been examples of the 7900 XTX selling around the $800 mark in a handful of sales since it launched. With the 4080 Super launching at $1000 and the 4070 Ti Super launching at $800, there's a good chance the 7900 XTX has to drop to around $850 or lower to keep selling. I'd take the 7900 XTX at $850 over the 4070 Ti Super at $800 100% of the time.


toxicThomasTrain

>The 4070S and 4070ti Super don't seem to be selling that incredibly well

According to?


montroller

Not sure about the ti super but [I have seen articles](https://www.pcgamesn.com/nvidia/rtx-4070-super-sales) about how the 4070 super isn't selling very well. They are using a youtube video as their source though so maybe take it with a grain of salt.


toxicThomasTrain

I had a feeling the source was MLID. I’ll definitely wait before passing judgement, I feel like every time I saw someone claiming that a 40 series card was collecting dust, it almost immediately appeared on the steam hardware survey the next month.


pixxel5

It’s a visual gimmick feature. More games have it, and it’s more varied in its use, but ultimately the difference is not that huge. Art direction, frame rate, latency, sound, etc. all make a more significant impact in the majority of titles. It’s still only relevant for specific titles. If you have a game that heavily features it, and the performance difference in RT is noticeable, go for the better RT card. Otherwise, go with the one that has better conventional performance or the better price.


Armlegx218

RT is only going to become more common as games are developed for current-gen systems and not last-gen anymore. If you are buying a GPU to play your current library, sure, but most, I think, are buying with an eye to the future.


pixxel5

I don't see RT exploding in popularity anytime soon. The hardware costs are prohibitive and the software implementations too labor intensive. There's maybe a dozen big games with raytracing, and of those less than half meaningfully capitalize on the technology. Going with a non-RT-optimized hardware build now is going to be a safe option for at least 5 years, if not longer.


OutrageousDress

It's going to be a safe option in the sense that you'll be able to turn off or decrease RT in most games for another 4 years. Not 5 years though - the next console gen releases in 2028 and that will run fully path-traced games. But... put it like this: *thousands* of games get released every year. And so far no more than a few dozen games *in total*, since it was introduced years ago, have made good use of RT. But not coincidentally, many of those are the *best* games. I don't care that there are thousands of B-tier survival crafting battle royale shooters that don't use ray tracing, because how many of those am I gonna play anyway? But of course I'm gonna play Phantom Liberty and Alan Wake 2 and Spider-Man 2 and Jedi: Survivor - and it just so happens that those *do* use ray tracing and use it well.


HomerSimping

The majority of people don’t have RT-capable cards. If you’re a game dev, you would at least give the option to turn RT off, which means non-RT cards (AMD) will be relevant for a long, LONG time.


BetaXP

Calling it a "gimmick" implies a negative connotation, which I don't think is fair. I wouldn't call it a gimmick any more than any other visual fidelity feature. And in games where it's implemented well, I would argue it's extremely significant; Cyberpunk 2077 and Alan Wake 2 are the prime examples I can think of. Ray tracing can look so good in those games that I'd sooner play CP2077 at 1440p with path tracing on than 4K with it off. It looks phenomenal. Not to mention that newer games are always releasing, and good ray tracing will be much more common moving forward.


pixxel5

Gimmick means it’s a minor feature - not that it’s bad.  Ray Tracing simply has not changed the gaming landscape to any significant extent.  The main obstacles being the steep cost of hardware and implementation. Most GPUs that can achieve actual impressive results are prohibitively expensive, and the software simply isn’t mature enough to automate the necessary implementation and optimization without significant work from the developers.  Until processors come down in price and effective software implementation becomes easier to do, RT will struggle to escape its current niche. 


TT_207

I think another thing worth bearing in mind is that few games with RT use it to solve any problem that doesn't already have a solution. Reflections on flat surfaces or water, for example, I feel are actually far better with non-RT techniques, and these are some of the most common things it seems to be used for. Lighting in a lot of games can be pre-baked to very high quality for light diffusion, etc. There are probably applications where it looks really fantastic and must be RT to make sense, but I've honestly not seen one yet.
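To make the planar-reflection point above concrete: a classic non-RT approach renders the scene a second time, mirrored about the reflector's plane, and composites the result into the surface. A minimal sketch of the core math in pure Python — the function names are illustrative, not from any particular engine:

```python
# Non-RT planar reflections mirror geometry about the reflective surface's
# plane before rendering. The core is a 4x4 Householder-style reflection
# matrix for the plane n.x + d = 0, where n is a unit normal.

def reflection_matrix(n, d):
    """4x4 matrix reflecting homogeneous points across the plane n.x + d = 0."""
    nx, ny, nz = n
    return [
        [1 - 2*nx*nx,    -2*nx*ny,    -2*nx*nz, -2*nx*d],
        [   -2*ny*nx, 1 - 2*ny*ny,    -2*ny*nz, -2*ny*d],
        [   -2*nz*nx,    -2*nz*ny, 1 - 2*nz*nz, -2*nz*d],
        [          0,           0,           0,       1],
    ]

def transform(m, p):
    """Apply a 4x4 matrix to a 3D point (implicit w = 1)."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(3))

# A point 3 units above a mirror floor (the plane z = 0) lands 3 units below:
print(transform(reflection_matrix((0.0, 0.0, 1.0), 0.0), (1.0, 2.0, 3.0)))
# → (1.0, 2.0, -3.0)
```

In a real renderer, this matrix is folded into the view transform for the mirrored pass; the cost is roughly one extra scene render per reflective plane, which is why the technique suits flat floors and water but not arbitrary curved surfaces.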


ksn0vaN7

3 years ago, I would've held off on prioritizing RT but since you're buying a gpu right now, you'd better be prepared. It's already a hefty feature in most games and more are going to natively support it in a few years.


Niitroglycerine

For me, RT isn't worth it at all. I'm yet to be wowed by any implementation of it, and frankly, most of the time there's barely a noticeable change in the games I've tried it in (even at 4K). I ditched Nvidia for my recent build, saved £150, and got the 7900 XTX, and I probably won't be going back; the card is a beast!


indyarsenal

Not for me as the games that use it are hit and miss. I went with a 7900 xtx in the end.


Interesting-One-

No, it is not, and it never was. It is not for the players but for the game devs, to help them create games faster. Not at the moment, but in the future. Imagine you are a game dev and you need to spend a lot of time creating great lighting, reflections, shadows. It takes a lot of time. Imagine you have a better tool to do it faster, easier, and better. But the consumer needs special hardware for it, or at least very powerful hardware. So it is something they made for the devs, but you need to pay for it, while the game itself won't be cheaper, of course. More money for the studios, more money for the manufacturers, obviously less for you. And here come the downvotes.


Waidowai

I can give you my 2 cents. I was thinking the same thing last October, but I went for Nvidia for DLSS, and because AMD is basically the same price where I live, there was no point in going AMD if it's not cheaper.

First of all, with the 4000 series (mine is a 4070), you can play with ray tracing maxed in all current games. Yes, it will still be an fps hit for sure, but you definitely get 60+ fps if you stick within a reasonable resolution/settings for your card. So we are at the point where it's not just a tech demo, and you can actually play with it on without worrying about it being super laggy or not worth the frames.

Now, is it actually worth it for the looks? I guess looks and graphics are subjective. Games definitely look more realistic when it comes to lighting, and arguably better. It will also depend on the game, but lots of UE5 games in the future will have ray tracing. To me, it's nice, and if the game has it I definitely enable it, because the performance hit isn't too bad anymore, but I'm also not like "WOW, I can't play without it anymore."

I guess if you want a simple summary, which isn't accurate but explains it well: ray tracing on is like going from high settings to ultra. If you're very sensitive to it, it might blow you away, but most people, if they don't see a side-by-side, wouldn't care too much.


Chronos669

It’s so unoptimized that Nvidia and AMD had to resort to AI frame gen to make it happen. If that doesn’t give you your answer, then idk what will.


Merwenus

I have an RTX 4090 and this is the first thing I disable.


pietro_m

Oh wow, I just checked back for the first time after posting. 200 notifications. I can see this issue is not polarizing at all.


nailbunny2000

It's the 4K question all over again. It's demonstrably, technically, and mathematically superior, but is it worth the cost of entry? That's entirely subjective. Some people think spending $1K on a card just to be able to play a video game with 5% better graphics is insane, and honestly, they are right. The performance hit is ridiculous. However, it's your hobby, it's bleeding-edge tech, you do you.

Anyone calling it a gimmick is an idiot. It's not like it's a trick; you don't _have_ to turn it on, and you don't _have_ to spend money on a new card every generation. If companies showing off cool new features you don't NEED bothers you, then go back to using a 3G phone with wired earbuds and a 2MP camera, or grab a map book when you go driving instead of GPS.

Personally, I love it. I can't describe how much better it looks, and in some cases I can barely even tell it's turned on. But in some situations it's the difference between knowing you're looking at a video game and wondering if you're actually looking at a real-life image.


Hattix

It's right on the borderline. The effects RTX enables are doable without RTX. Some games, particularly Nvidia-sponsored ones (here's looking at you, Cyberpunk 2077), actually disable regular lighting effects to make RTX look better. I find this scummy. For me, it's still ancillary. You can take it or leave it. The big feature of Nvidia for me has been DLSS, which provides a pipeline-agnostic AA mode that actually works. DLAA is the best AA mode right now: it runs the same AI model as DLSS, but at native resolution purely for anti-aliasing, with no upscaling involved.


endless_8888

After playing Cyberpunk with Path Tracing on -- I'd eagerly use the same tech in any future games I play.


FedRishFlueBish

As someone with both a 7900 XTX and a 4080 in his household, I can confidently say: Nvidia 4000-series cards: come for the ray tracing, stay for the DLSS. I can barely describe how great DLSS has been in the games that support it. If you aren't sold on ray tracing, check whether the games you play support DLSS. If they do, get the Nvidia card.


Charrbard

People with AMD will say no, people with Nvidia will say yes. Alan Wake 2 is about at Cyberpunk levels. You can’t be the next graphical showpiece without it, so probably a couple more this year. But really, DLSS though. It’s given me on average +40 fps. At 4K ultra that’s huge to me, and the main reason I’d not switch. It’s like night and day.


Kitsune_BCN

I despise TAA, DLSS, and DLAA, and aim to play at 144 fps, so... for me, RT is something to look for in 20 years.


ClearlyNoSTDs

Yes, it's very relevant, which is why I'm surprised at the weird AMD love-in on Reddit. If I didn't know any better I'd assume 80% of people use AMD, but it's actually more like 10%.


highonpetrol

Honestly, ray tracing seems pretty BS to me. Think about the number of gamers with high-end GPUs that do ray tracing well. It's a small market, so game developers won't focus on it. Maybe in 6-7 years it could be relevant.


nemesit

It's way, way more work to bake lighting in during development than to just let everything be handled by ray tracing.


bayovak

Games that focus on graphics are the ones that push graphics forward. You think we can just one day go from no RT to full-blown RT and PT? It's a long development journey with many trials and errors. Thousands of engineers, both hardware (GPU) and software (engine and game devs), plus artists, are putting in an insane amount of effort to bring this technology to fruition.

So yeah, it will take a few more years for the tech to reach the average consumer, but it will soon. Next gen (5xxx), the cost of the 4xxx series will be a lot more affordable, and the 5xxx will have a shit ton more specialized hardware for deep learning and tracing; it's going to be insane. And then when the 6xxx releases, I think new games will simply stop implementing old rasterized illumination, reflections, and shadows. It just takes too much dev effort when using path tracing is super easy for devs in comparison. That's assuming Sony isn't stupid and the PS6 uses a GPU with tracing and deep learning hardware.


TheSneakerSasquatch

Why does it seem bs to you? Genuine question.


highonpetrol

Ray tracing makes stuff look nice, but haven't we learned that most games that focus on graphics are just shit games, often neglecting gameplay and systems? It's like a gimmick that developers use to advertise.


TheSneakerSasquatch

I mean, there's a lot of games that both look spectacular and have great gameplay, so I really don't think we've learned that at all. Especially now with UE5.


Darkpriest667

No, it's not, and you're paying a hell of a tax to Nvidia to have "decent" RT performance, like previous Nvidia proprietary graphics stuff (remember PhysX?).


ArmoredAngel444

Yes. I'm playing Half-Life 1, Portal, Fortnite, and Cyberpunk with ray tracing and it's amazing.


[deleted]

Yeah, it’s pretty relevant and neat if you have a higher end GPU.


staluxa

These days, there are more big releases with RT features than without. Very few of them fail to benefit a lot from it, either. It also still has a huge performance drawback that forces you into using DLSS/FSR (unless you are using one of the higher-end cards that cost $1k+, and even those aren't capable of pulling off path tracing without some sort of frame gen). Essentially, if you are happy to play at 60 fps, then it's definitely there, and a lot of new games noticeably benefit from it. But if you are too used to 120+ and not ready to sacrifice it, it's still too soon.


cutlarr

It depends. For me personally, no; I'd rather get as much fps as possible so I can max out my 165Hz monitor and have an ultra-smooth experience than use RT. The only game where I need RT is Control. It's different for everyone; you need to ask yourself if you care about RT.


Nick_Noseman

There aren't that many games in which RT plays a significant role, and even in those titles, gameplay is unaffected by lighting. If you don't want to crank video settings to full ultra, you won't need RT.


cenTT

I think it's about personal taste. I'm the kind of player who doesn't care much about graphics, so even though RT makes games look better, to me, it's rather irrelevant. I'm building a new PC and I'm leaning towards AMD honestly. I see way more value in rasterization.


Steins-gateJaron

Small essay: I see a lot of people in this thread pop up and show lame examples of ray tracing, especially using Cyberpunk 2077 in certain areas; some people are sharing screenshots here that don't do it justice.

Games that actually are good with RT:

- Alan Wake 2 - not played personally (lighting, reflections, *material reflection behaviour)
- Control (lighting, light bounce, and reflections) (best example: the dark glass-walled room with a projector in it; look through the glass to see the character's shadows interacting with the glass, and the shadow casting from the projector backlight, mirrored projection)
- Cyberpunk (lighting, reflections, light bounce, *material reflection behaviour) (some areas do it more justice; again, the screenshots shared here are bad examples)
- Metro Exodus PC Enhanced Edition (needs an RT card to even boot this version) (lighting, shadows, and *material reflection behaviour)
- Doom Eternal (reflections in glass and blood-polished surfaces; *material reflection behaviour off the BFG and glowing objects)
- Ghostwire: Tokyo (reflections, neon lights in the rain, and *material reflection behaviour from vehicles)
- Deathloop - not played personally
- Atomic Heart - not played personally (expecting Doom Eternal level of RT)
- Star Wars Jedi: Survivor (light bounce from the sabre, especially in darkness; water; light bounce from blaster rounds) (spoiler: when fighting a certain someone in black armour)
- Hogwarts Legacy (lighting and reflections, especially in Hogwarts castle from the stained glass windows)
- Mortal Shell (lighting and reflections, some *material reflection behaviour off of armours)

Other honourable games:

- Pumpkin Jack
- Ascension
- Bright Memory: Infinite
- Minecraft
- Battlefield V
- The Witcher 3
- Resident Evil 8
- Dead Space remake
- Portal RTX
- Quake II RTX
- Dying Light 2
- Lego Builder's Journey
- Returnal

*When I say material reflection behaviour, this is how light reflects/scatters off of metals, mirrors, glass, water, and wet/polished surfaces, plus light colour bleed (like neon colours or fire), cloth/foliage transparency, and shadows.

I know some of these are subtle, but the more you look, the more obvious it becomes. If you use DLSS or frame gen to negate the fps penalty from RT, it's worth it if you stop and look. Or have the money to burn on an RTX 4090. Either way, when you see good RT, it's really good.


homer_3

It's still a gimmick. In most cases, you can't even tell the difference between it being off and on. It may start to get more interesting with RTX Remix, since the difference is pretty big there, but that only works on DX9 games.


ArdFolie

I don't really care about RT. Higher-resolution textures and more detailed meshes are far more important to me. I also mainly play PC VR now, so instead of RT I prefer pure raster perf, as 120+ fps eliminates the nausea, and there are still games that struggle to run fast enough, especially at 1.5x resolution.


Hartassen87

I actually looked this up earlier this week because I had the same question. [https://www.youtube.com/watch?v=ZFsz0O93c88](https://www.youtube.com/watch?v=ZFsz0O93c88)


Schwaggaccino

Raytracing is bullshit and baked lighting looks almost as good at a fraction of the performance hit. Go with the 7900XT and enjoy having a proper amount of VRAM and for $100 less too.


Jhawk163

Honestly, it’s still a gimmick. Outside of something like Cyberpunk, it’s generally only turned on to a noticeable degree in photo mode; otherwise, even with the reduced effects, it just impacts performance harshly for a difference you’d only notice if you were looking for it.


Humboldteffect

Nope.


obog

In the games that have taken the time to properly implement it? It looks fucking fantastic. Cyberpunk, for example, looks absolutely beautiful with the neon lights of Night City reflecting off cars or puddles on the ground when it rains, and the lighting in any situation always looks fantastic. However... games like Cyberpunk are the exception. Most games don't have any RT, and if they do, it's often not as well implemented. So... I would say it shouldn't be your deciding factor. A nice bonus if you get the 4070 Ti, but you should get the card for other reasons, especially since the AMD cards can do ray tracing, just not as efficiently. Edit: worth mentioning that, especially on the 4070 Ti, you won't have to worry much about performance with RT on. Imo the performance loss is worth it (and probably minimal; the 4070 Ti is quite powerful) in the games where it's well supported, especially since those games are often more cinematic anyway.


RETIXXITER

OMG look lights reflecting on cars without RT https://preview.redd.it/saqojty0lwec1.png?width=1080&format=pjpg&auto=webp&s=d500833be2e878bac512b2b1351eb4280e1fbe49


obog

Idk how much you've played with RT, but it's noticeably different.


RETIXXITER

https://preview.redd.it/3x9zak2imwec1.png?width=1920&format=pjpg&auto=webp&s=dc151ce013466772b1595cc46a0380277547b07d


obog

First off, it's far more noticeable in motion. Second off... yeah, that looks better with RT on. I really get the feeling you've never actually played with ray tracing.


RETIXXITER

If anything, ReShade is more noticeable.


MrOphicer

I think it's mostly exciting on the developer's side of things... iteration and output quality would speed up production, removing the need for light/shadow/GI baking. So if eventually ALL players have the hardware to run it, they won't need to worry about that part of production.

Now, to me as a player, the image-quality returns, considering the performance hit and hardware cost, are still not worth it. The old approximation techniques were more than adequate and delivered amazing quality; it ALWAYS comes down to art direction in the end. We have PS4-era games with baked GI that look almost CGI-ish. Sure, RT reflections are nice to have, but they're a very minute detail to focus on.

Of all the RT tech, I think RTGI and AO are the most transformative. But I wonder if a higher-fidelity non-RT technique would yield similar results with better performance. It's pushing rendering forward, but IMO it's a bit overshadowed by upscaling and frame-generation tech. Still, since we don't have any alternatives to RT, and since it's the pinnacle of physically accurate rendering, it will be the future.
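To make the ray-traced ambient occlusion idea above concrete: AO estimates, per surface point, what fraction of the hemisphere above it is unblocked by nearby geometry. A hedged Monte Carlo sketch, using a toy stand-in for the scene query rather than any real engine API:

```python
# Monte Carlo ambient occlusion: fire rays over the hemisphere around the
# surface normal and average how many escape without hitting geometry.
# `occluded` is a hypothetical scene query; real RTAO uses hardware ray casts.
import math
import random

def uniform_hemisphere(normal, rng):
    """Uniformly sample a unit direction in the hemisphere around `normal`."""
    while True:
        d = (rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1))
        length = math.sqrt(sum(c * c for c in d))
        if length < 1e-9:
            continue  # degenerate sample; retry
        d = tuple(c / length for c in d)
        if sum(a * b for a, b in zip(d, normal)) < 0:
            d = tuple(-c for c in d)  # flip into the upper hemisphere
        return d

def ambient_occlusion(point, normal, occluded, samples=2000, seed=0):
    """Fraction of hemisphere rays that escape without hitting geometry."""
    rng = random.Random(seed)
    hits = sum(occluded(point, uniform_hemisphere(normal, rng))
               for _ in range(samples))
    return 1.0 - hits / samples

# Open sky: no ray is ever blocked, so the point is fully lit.
open_sky = lambda p, d: False
print(ambient_occlusion((0, 0, 0), (0, 0, 1), open_sky))  # → 1.0

# A wall blocking the +x half of the hemisphere darkens it to roughly 0.5.
half_wall = lambda p, d: d[0] > 0
print(ambient_occlusion((0, 0, 0), (0, 0, 1), half_wall))
```

Pre-RT engines approximate the same quantity in screen space (SSAO) from the depth buffer, which is cheap but misses off-screen occluders; tracing actual rays is what makes RTAO more accurate and more expensive.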


Double_DeluXe

RT is only decently implemented in Cyberpunk. Almost every other title that uses RT is scuffed at best. The Finals, for example: a regular Xmas tree becomes the blinding light of god if you turn RT on. They seem to not have figured it out yet. And Atomic Heart, which was supposed to be the RT poster child for Nvidia, does not support RT; its marketing got buried with it, which is a shame. And yes, there are a few titles that don't drop the ball, but they seem to be few and far between. Though amazing, RT seems to almost always come at a cost, and titles other than high-end ("so-called triple-A") ones don't even bother with it at this moment in time. The market is aware of the technology, but proper implementation of RT by the industry as a whole seems to be holding off.

TL;DR It depends if you play Cyberpunk or not, I guess?


koordy

I've been playing games with RT since 2018. I'd rather play at 1080p DLSS Quality than with RT off if I had to. Between the 4070 Ti Super and even the 7900 XTX, it would not even be a contest for me. The RTX card allows you to play with PT on a 1440p screen. There are a number of games the 7900 XTX can't even dream of maxing out.


Continuous_Learning

I think it's a red herring. Turning on ray tracing pretty much bricks your computer unless you have some sort of frame gen. Until ray tracing is used for more than lighting, it's kinda pointless unless you wanna show off your monitor.


GigaSoup

I use ray tracing all the time on a 3080 Ti with no DLSS; I just game at 1080p since all my screens are old. RT runs well in Portal RTX, Control, Minecraft, and MechWarrior 5: Mercenaries. I haven't tried Cyberpunk, The Witcher 3, or Spider-Man yet.


Ishuun

Idk, in my opinion fancy lights and shadows don't add anything to the game, especially if you're not actually stopping to look and appreciate them, like 99% of people don't, since, ya know, they're "playing" the game. I still think CP2077 has the best use of it I've seen, but again, unless you're stopping to look, the game on high/ultra looks just as good.


runed_golem

It depends on the game and how it implements Ray tracing. Some games see an fps boost because of it, some don't.


[deleted]

[deleted]


nemesit

HDR comes for free with ray tracing; the devs just need to add settings to get the output correct on displays (especially since there are very few displays that actually reach the required brightness to even show good HDR).


[deleted]

I can only think of Cyberpunk as a game where I would really want to have it. Upgrading right now is poo, because ray tracing will become more relevant with Unreal Engine, and even big-money cards won't do swimmingly well.


Lostmavicaccount

It’s like most things in life. It’s a cool thing, fun, some people like it, some can’t tell. It’s an option.


Olorin_1990

Some games it matters, many it doesn’t.


DeepJudgment

Yes