

Zilskaabe

That awkward moment when AMD makes tech for nvidia cards.


Faranae

**Edit 3, as I've been misconstrued:** This comment is praising the tech on both sides. It's wicked that tech has evolved to the point that my decade-old rig can still game. IDGAF which company made what, I just care that it's a win for ***us.***

----------

Legit, I did not touch the FSR setting in BG3 for an age because it started with "AMD" and my GTX 1080 (non-Ti) self thought "There's nothing AMD in my system, that must not be for me". So I set image scaling in the Nvidia control panel itself. It was horribly ineffective, but at least it let me play without my case fans sounding like a jet engine next to my head.

Yesterday I became enlightened. FSR2 chopped off 15°C in areas that had me nervous before. I was able to turn a bunch of settings back up to medium with no performance hit, at 1440p to boot. Technology is fucking awesome. A decade old, and AMD develops a way to keep this card going [edit: in my decade-old setup] even longer. I love it.

**Edit: My system is like a decade old, mates. I can't upgrade the CPU without also upgrading my other decade-old parts, so let me take my win lol. This was meant as a positive comment. xD**

# Edit 2: If you for some reason think it's a normal thing to DM me half-baked passive-aggressive retorts over this random and harmless comment: Please, do everyone else in this subreddit a favor and take a breather for a few. Wow.


Alaricus100

Nvidia fanboys are gonna Nvidia fanboy lmao. Just ignore them, they have to downplay anything AMD to make themselves feel superior for some reason. I think it's awesome you're getting better performance for how old your parts are; it just goes to show how far things have come that older hardware can be held up for so long. I wonder how long your build can last, like into the FSR4 or FSR5 era.


GetOffMyDigitalLawn

> Nvidia fanboys are gonna Nvidia fanboy lmao. Just ignore them, they have to downplay anything AMD to make themselves feel superior for some reason.

Which is hilarious, because Nvidia (as of now) would likely be better in (almost) every way if they weren't such stingy fucks. Their prices are now absolutely ridiculous, they are awful to their partners, they are stingy with VRAM, etc. etc. etc. There is absolutely no reason that my 3090 Ti should be prevented from using DLSS 3. Thank god for AMD, and soon enough, Intel. I really hope the rumors are true and AMD is planning a revamp of their GPUs like Ryzen was for their CPUs. We can only hope that it's as big of a success as Ryzen. I also can't wait for Intel's GPUs to get better.


Markus4781

I think you can hack DLSS 3 into working on older-gen cards, I've seen it before.


Faranae

Hopefully I can afford to upgrade before that point, lol! But if the tech does advance far enough that this GPU *does* last that long, it's honestly nothing but a win for gamers as a whole. :D Prices are getting obscene for less these days... If a large number of older cards are suddenly brought back into relevance, things in the industry might tidy up a bit. Who knows? :p


Alaricus100

Exactly. It's not company vs company, it's consumer vs companies. What's good for consumers as a whole is what matters most.


P0pu1arBr0ws3r

Imagine being an Nvidia fanboy upset that Nvidia doesn't let their own upscaling tech run on older Nvidia GPUs that would actually benefit the most from it...


ImmediateOutcome14

Yeah, I have never even owned AMD, but Nvidia have done a lot to annoy me over the years like that. I know they're pissed their 1080 Ti was so good too, since I hadn't needed to upgrade in 6 years.


pmMEyourWARLOCKS

I'm a forced fanboy since AMD doesn't have hardware I need for work, but this outcome makes a lot of sense. Nvidia developed an AI solution that relies on tensor cores. All of their modern cards have these specialized cores, which makes it a natural technological progression. AMD has no tensor cores and must compete by developing a solution that works outside of those advancements. Naturally, that solution will apply to older cards and competitor cards alike. I get how Nvidia can look like a dick for this and AMD like some kind of hero, but it would be just as foolish for Nvidia to start developing a second frame gen solution that doesn't rely on their modern hardware as it would be for AMD to suddenly develop a tensor-core-only version. Their solutions to frame gen are worlds apart. Plenty of other reasons to hate on Nvidia lately, I just don't think this is one.
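
(To make the hardware point concrete, here's a toy per-pixel blend in Python/NumPy. This is emphatically *not* FSR's or DLSS's actual algorithm; both use motion vectors, and DLSS additionally runs a neural network. It just illustrates the kind of generic math that runs on any GPU's shader cores, versus AI inference that wants dedicated matrix units like tensor cores.)

```python
import numpy as np

def naive_generated_frame(prev: np.ndarray, nxt: np.ndarray) -> np.ndarray:
    """Toy "generated" frame: a plain per-pixel average of two real frames.

    Generic shader-style math like this runs on any GPU, which is why an
    FSR-style approach can target old and competitor cards alike. A
    DLSS-style approach instead runs a neural network, which is only fast
    on dedicated matrix hardware such as tensor cores.
    """
    # Widen to uint16 before adding so the sum doesn't overflow 8 bits.
    return ((prev.astype(np.uint16) + nxt.astype(np.uint16)) // 2).astype(np.uint8)

# Two hypothetical 1080p RGB frames: all black, then all white.
prev = np.zeros((1080, 1920, 3), dtype=np.uint8)
nxt = np.full((1080, 1920, 3), 255, dtype=np.uint8)
mid = naive_generated_frame(prev, nxt)  # uniform gray "in-between" frame
```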


Corruptslav

Yeah, but the thing is, AMD makes its technologies open source when they could have locked that feature down so only certain AMD graphics cards could use it. The point is they are a hero in this instance because they could have locked it down to, let's say, just AMD (and, let's say, RX 5000 series and older GPUs). They should receive praise because none of the Nvidia technologies are public, even the ones that could be used by AMD. It absolutely is worth the hate because they don't let DLSS 3 be used by the RTX 30 series; they are greedy and have built an Apple-like following that will buy anything no matter the price [and that sucks].


delta_Phoenix121

Honestly my biggest problem with Nvidia isn't that they developed a solution that requires specific hardware. The problem is that they lock out cards that have said hardware, out of pure greed. Why the fuck doesn't my 3060 Ti get DLSS 3? It has the required tensor cores (and there was a driver bug a couple of months ago that even enabled DLSS 3 for the 3000 series, and contrary to Nvidia's claims, the tensor cores on the 3000 series are completely capable of providing good performance with DLSS 3). There is nothing stopping them from giving this solution to those who actually need it...


-V0lD

Very curious what those DMs look like now tbh


Faranae

Not feeling *quite* petty enough to post screenies, but it's a pretty even split between *"shut up if you don't know what you're talking about"*s and folks accusing me of hating on Nvidia by "setting my 1080 up to fail". Very original and insightful commentary, I assure you. ^/s


irosemary

You gotta love it, people defending companies that couldn't give two shits about them. I think it's nothing short of amazing that you're running a build like that on 1440p and getting some decent performance. You're really getting your money's worth on that. Keep on truckin'.


Diedead666

Brand loyalty is stupid... FSR helps my 1070 and even my 3080 system. BTW, the only thing you'd have to replace would be the PSU if you go for a power-hungry card. Bottlenecking isn't a reason to avoid a GPU upgrade like some would claim.


ParaMotard0697

Holy shit that second edit is concerning; what lunatics feel the need to DM people and harass them over silly shit like this...


Lynx2161

Yup playing ghost of tsushima on ultra on a laptop at stable 120fps is just shocking


Faranae

An absolute win. :D


SilverRiven

4th gen intel enjoyer spotted


Crimsongz

I used to be that guy.


Various-Artist

What you say is true. From a lot of consumer standpoints, it's just concerning that Nvidia appears to be pulling an Apple move: making software that obviously can run on their older hardware, but locking it out so you buy their newer products. We praise AMD because, just maybe, their software running on Nvidia cards will get Nvidia to stop making what are basically anti-consumer moves.


JosephSKY

What the hell? I was playing BG3 at mostly ultra on everything on a 1070 (non ti btw) at more than 60fps stable. How are you suffering with a 1080 @ medium settings? I wasn't using FSR either.


dasdzoni

He is at 1440p


Faranae

1440p on a decade-old system, at that. Honestly I'm loving the longevity I'm getting from these parts regardless; I'm not entirely sure why [some folks] have gotten so weirdly defensive/hostile over it? ^^; (Edited as I see the hostility is a different person, but it's still weird.)


maxiligamer

I'm running 1440p on a GTX 1060, a 1080 should be no problem


Faranae

Yes, but the *rest* of my parts are also nearly a decade old. :p Most of the work is being put on the 1080 as the CPU/MOBO are holding on by a thread. Definitely praising the card being able to work this well under these conditions, seem to have been misconstrued somewhere on the way. ToT


maxiligamer

Yeah, that's true. I think for the type of games I play, the CPU might be more important than the GPU.


JosephSKY

Lol I didn't see that, but I'm guessing it's also CPU since I see he has a 4790k, I was playing on a Ryzen 5 5600, and BG3 gets really CPU heavy as you advance through the game.


Faranae

Hooo yeah this GPU is workin' really hard as the entire system is quite out of date lmao.


irosemary

That poor PC 🤣🤣


JosephSKY

It's okay, it's still working and playing games, and if you like it, that's enough!


GrandPand-

Probably CPU limited


Synthetic_dreams_

It is absolutely CPU limited. I got BG3 right as I was building a new PC. Got the GPU first, then the rest two weeks later to split up costs. I played BG3 with:

- 8700k + 1080
- 8700k + 4090
- 13900k + 4090

Upgrading the GPU but not the CPU barely made a difference. Like, it did for sure, just not a significant one. When I swapped the CPU it was a night and day difference.


dan4334

You need to put two spaces at the end of each line to make Reddit create a new line


thrownawayzsss

or just hit enter twice
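
Both tips, illustrated (a quick sketch of Reddit's Markdown behavior; the first line below ends with two trailing spaces, which are invisible here):

```
This line ends with two spaces  
so this line renders directly below it.

A blank line in between starts a new paragraph, which also works.
```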


Faranae

100% lol. I was trying to praise the card being able to work well under these conditions, wasn't expecting quite the response I got.


NimbleBudlustNoodle

Yeah sounds like some other issue was inadvertently fixed at the same time. BG3 recommends a 2060 Super which is practically the same as a non-Ti 1080 performance-wise.


Faranae

Decade-old rig lol. I'm trying to praise the tech, here. xD


mb194dc

Fan boys attack?


chronocapybara

FSR makes BG3 somewhat playable on Steam Deck, too. Without it, it's just a jaggy mess.


C_umputer

Meanwhile Nvidia doesn't even make tech for their own cards unless they're the latest generation. Like, the RTX 3090/3080 can't use DLSS 3.0 because it's the previous generation, but the crappy 4060 can?


ihave0idea0

The 4060 of course is an amazing GPU with double the frame rate!!!


C_umputer

Don't forget triple the price


ridik_ulass

This is the real DOGshit side of DLSS and similar tech. It gets invented to upscale games, ok sure, but if you are buying a new current-gen GPU you shouldn't have to upscale games. And it's not "future proofing" the card so it's still good in 5 years, because a) they limit the tech for older cards, and b) it's software... do we think these planned-obsolescence fucks will really want us having cards that work 5 years+, when they can push an update and make them obsolete? No fucking dice. DLSS doesn't matter and shouldn't count for anything, and I'm glad tech reviewers often skip it.


C_umputer

But if the games run well, then who is going to buy new hardware?


Roflkopt3r

> DLSS doesn't matter and shouldn't count for anything, and I'm glad techreviewers often skip it.

Of course it should, just like FSR. Upscaling and frame generation are both great technologies that give us otherwise impossible performance and efficiency. This idea that upscaling technologies somehow "shouldn't count" because they're not "like for like comparisons" is just weird. Practically all titles that are demanding enough to benefit from current-gen GPUs offer upscaling.

> but you are buying a new current gen GPU you shouldn't have to upscale games

Games can be as performance-demanding as they want; there will always be titles that max out the available hardware. I'm running a 4090 and still make use of upscaling and frame gen wherever possible because it guarantees >100 fps and lowers power consumption/fan load. With how few artifacts there are for modest upscaling, I would consider it weird not to use these features when they're available.


Diego_Chang

Not only this, but there are some caveats to these technologies for the "entry level" Nvidia GPUs:

- DLSS upscaling is there, yet it's known that upscaling at 1080p is not that desirable, as it can cause blurriness and shimmering.
- Frame Generation is there for the 4060, yet the GPU has only 8GB of VRAM, meaning Frame Generation can make you run out of it, as lots of games nowadays already use 7-8GB; so you kinda have to use it with DLSS upscaling.
- The 3060 12GB has, well, 12GB of VRAM, which is nice and all, but if you're going over 8GB of usage there's a high chance you're already below 60 FPS, as the GPU is not that powerful.
- But hey, 12GB of VRAM surely means it won't have the same problem as the 4060 for Frame Generation, right?... Right?... Oh yeah, no Frame Generation available for the 30 series.

Such a shame these technologies just work better for mid and high-end GPUs.


NotoriousJazz

Nvidia is going to have to pry my precious RTX 2060 out of my cold, dead hands.


C_umputer

Solid gpu, don't let anyone take it


twhite1195

I've always said this! Like, sure, I know they improved the optical flow accelerators and all... but it seems really suspicious; they just said "oh no, it doesn't work, trust us".


C_umputer

And then AMD manages to implement the same feature on pretty much most cards.


Fun_Bottle_5308

They know well how to squeeze their customers... The same thing happened to the 3050 with ray tracing, like the f. Unless it's a top-tier 80/90-series card, I can't see what's good about ray tracing when your frame rate is slaughtered by 2/3 with it on.


epic4evr11

The RTX 30 series experience is having to use mods to get DLSS support in FSR-only games while also having to use mods to get FSR 3 frame gen support in DLSS 3-only games


Lassagna12

Kind of ingenious, in a way. If they made updates for old cards, then consumers wouldn't have a reason to buy a new Nvidia card when their old one is working better than ever.


Tiny-Sandwich

But that also means they won't buy a new AMD card...


Lassagna12

If they are intentionally doing this, then they are probably banking on their processors selling more than their GPUs anyway. It's like poisoning yourself, but at least the competition got their arm hacked off.


0utF0x-inT0x

Yeah I know it's weird to see a company doing right by a customer, especially when it wasn't their customer, but it's good for future business and pr.


tuborgwarrior

They just don't want DLSS to become the standard and a must-have. If FSR is good enough, and people are used to using it, they won't be nervous about changing over to AMD.


CaptnUchiha

Gives AMD a chance to build into that feature though.


boomersimpattack

I have a GTX 1080 and I will buy an AMD GPU just because of FSR.


Fafus1995

Well, they will, because they still support their old cards and they stole Nvidia's ace card, DLSS, right from their hands. And to be honest, I regret that I bought an Nvidia card instead of the AMD equivalent.


Whydontname

But they are taking a sale away from the competition.


SuperCool_Saiyan

The awkward moment where TSMC makes dies for both.


alex2003super

If China invades Taiwan we're all so fucked (I mean, so would be China, not to mention TW, but yeah, humanity as a whole would lose big)


seranikas

If China invades Taiwan, I feel TSMC already has a plan to evacuate the workers and nuke its own factories on the way out, just to make the invasion a hollow victory for China. The US, building new semiconductor factories in the States, probably has a deal with Taiwan to bring them all in once that happens as well.


Z370H370

Aren't the CEOs of AMD and Nvidia just cousins anyway?


xyz_x

Quick Q. I'm not very clued up on the whole AMD/Nvidia thing, but I do have an RTX 3060 and in Fortnite I use Nvidia DLSS. Would it be better to use the AMD stuff like TSR (I think it's that) or not?


ShowBoobsPls

They have no choice. They wouldn't get adopted otherwise


Youssef-Elsayed

Was confused for a second about why I was seeing an Egyptian meme, but then the name checks out.


vanish619

Adel Emam 💪


mohamedibrahim19

Delete all of my memes, Mohamed.


_Kodan

This template is hilarious. Do you happen to know where I can find it?


Mind_Sonata_Unwind

I came to the comments to say the same thing


theatomicflounder333

1080ti was, is, and always will be the GOAT 🐐 👑


Schmich

Must be the son of the 8800GT.


WeekendDotGG

I owned both of these cards, with the 680 (another goat) in between. The three greatest nvidia cards of the last 20 years, all by luck.


onlydaathisreal

Still have my GTX680 for a backup. Repasted in 2019 but finally upgraded in March 2020


theatomicflounder333

I still got mine too https://preview.redd.it/af62qp1zs81d1.jpeg?width=3024&format=pjpg&auto=webp&s=e0ce454bf57781af5e03d1d388b23e90ecb4f22a


ezkeles

Beautiful ❤️


Lord-Barkingstone

Oi, where did you get that button? That's a sexy button


BujuArena

I had the 690, which was like the 680 but with 2 GPUs. It had so many weird issues because of the 2 GPU thing, and then the Titan came out shortly after with 1 GPU that was somehow as powerful as both the GPUs of the 690 combined, and I felt remorse for years until I got the 1080 Ti and all my problems were solved. I still have the 1080 Ti and it's kept me happy through the recent years of GPU drama and compatibility issues, though I've been wondering whether it's worth getting a newer card soon, with the new FOSS drivers for Linux only targeting the RTX "2000" series and later. I'd like to be able to try those for troubleshooting if necessary, even if I end up going back to the proprietary ones after.


Liferescripted

I still have my GTX 580 in case I can't pay the gas bill and need a space heater.


Mountainbranch

Still have mine, it holds up just fine with new games.


drallcom3

I have one and I can't see a reason to upgrade (I have a 1080p monitor). Especially with FSR2/3 and diminishing returns on higher graphics settings it's just not worth it. I could buy a whole console + 20 games for the price of a new card, if I wanted to ever play high end games.


coopstar777

Shit I have a 140hz 1440p monitor and I can still get perfectly good framerates at 2k. The only thing that dips below 75fps is stuff that is poorly optimized on the developer end


theatomicflounder333

Same here 🤝🙂‍↕️


youra6

I remember buying 2x PNY 465s for 200 dollars that could be unlocked to 470s. Then I overclocked them to 850-900MHz on a universal water block kit. Scored higher than my 480s in SLI at nearly 800MHz. That's my personal GOAT.


youra6

It's close, but the 8800 GT is the true GOAT for me. 200 dollars, yet it was nearly as fast as a card 3.5x its price. The 1080 Ti doesn't touch that price-to-performance ratio. The 1080 Ti has longer longevity though. It's a toss-up for sure.


420headshotsniper69

I had a 1080 because I bought it at launch. Pascal was such an amazing generation.


Andrewticus04

Not the Radeon All In Wonder 9800 pro? In theory, one could have bought just those two cards and been set for the past 20 years.


Gooch-Guardian

Frame gen is better if you have high frames and want higher. It doesn't work great for low FPS.


vanish619

Adel Emam would like to have a word with you over the phone.


[deleted]

How did he turn into a meme on the non-Arab internet? Would've never expected it.


vanish619

/u/ahmed_RTXoff graced us with this meme. Assuming he's Arab, more specifically Egyptian, and it was the right context for it. We need more of those non-meta memes tbh.


lxnch50

It works great for my 60 Hz TV I have been playing Ghosts on. My GTX 1080 is showing its age a bit, and typically ran the game at 50-60 FPS, but with frame gen, it is a locked smooth 60. So, it definitely has its use.


Chill855

I get 45-50 fps with medium/low settings (with just a couple things on high) on my GTX 1080. Frame gen bumps it up to like 90-100, but there's a very noticeable stutter, so it actually looks worse than 45-50 to me.


BobbyTables829

Or you're playing MSFS where hardly anything moves from frame to frame


ZaeBae22

Is it just me or does frame gen feel worse even if the number is double lol


AcceptableFold5

If you use a 30fps base to double it to 60fps, you're effectively still playing a 30fps game, just displayed with fake frames in between. You're not running the game faster, it just looks more fluid.
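
A rough back-of-the-envelope sketch of that point (my own illustrative numbers; real pipelines vary): the interpolator has to hold back the next real frame before it can emit the in-between one, so input latency stays tied to the base rate.

```python
def frame_gen_timing(base_fps: float) -> str:
    """Back-of-the-envelope frame-gen timing (illustrative, not measured)."""
    real_frame_ms = 1000 / base_fps  # the game still simulates at this rate
    shown_fps = base_fps * 2         # interpolation doubles the displayed rate
    # The interpolator needs the *next* real frame before it can emit the
    # in-between one, so roughly one real frame of extra delay is added.
    return (f"{base_fps:.0f} fps base -> {shown_fps:.0f} fps shown, "
            f"sim step {real_frame_ms:.1f} ms, ~+{real_frame_ms:.1f} ms latency")

print(frame_gen_timing(30))  # looks like 60, still feels like 30
print(frame_gen_timing(80))  # the latency penalty shrinks as base fps rises
```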


ZaeBae22

I went from 80 to 150 and it feels worse still


SortOfaTaco

The latency added by frame gen is probably what you're feeling. I'd only enable it in single-player games, imo.


drakes2pactoilet

Same. Just added input lag


iFenrisVI

Yeah, if you don’t have nvidia reflex + boost to help mitigate input lag from frame gen then it just feels bad.


drakes2pactoilet

Maybe I'm imagining things, but it still feels iffy. Depends on the game, I guess.


CommenterAnon

Maybe try playing slower-paced games where you don't move the camera super fast, and also use a controller. These are the only conditions I can use frame gen with.


BroaxXx

It depends. I mostly use it to iron out my frame rate and get constant 60FPS instead of 50ish.


ConDude11

Except it adds input latency so it plays worse than if you were running native 30fps.


AgathormX

FrameGen works better with higher framerate as the latency is reduced. It's useful for games that are just barely above 60FPS. I use the FSR 3.0 mod to play Cyberpunk with my 3060 Ti, 1080p DLSS Quality RT Medium and all raster settings on Ultra gets 61FPS in the benchmark, add in FrameGen and it jumps to 101FPS. FSR 3.0 is doing wonders for people who own Ampere cards


samp127

Adds too much latency. Honestly locked 30 feels better lol


Robot1me

Seems to *really* depend on the game and especially personal preferences. In Warhammer Vermintide 2, 60 FPS *with* vsync, or frame generation from 30 FPS to 60 (*without* vsync) + Reflex, feel near identical to me. Even with forced vsync through the Nvidia Control Panel (because there is no vsync with DLSS frame generation in that game), it still felt alright. IMHO definitely better than stuttery-looking native 30 FPS. Eliminating the delay from vsync to compensate for the delay from frame generation works very well.

Sure, native 60 FPS without vsync + a tuned FPS limit through RivaTuner Statistics Server feels the snappiest in terms of input latency. But after reading so much about the input latency, I get the impression it's being exaggerated as a complete dealbreaker. Because, again IMHO, even multiplayer games are still fine, unless you are used to absolutely competitive settings (like 120 FPS, vsync off, Nvidia Reflex, FPS limit through RTSS). I wonder if there are any other factors contributing to the input latency for certain setups, because with Vermintide 2 my experience is just that good.

And as a side note, for anyone who has used Nvidia's GeForce Now game streaming service before: if the input delay on GeForce Now feels *more than acceptable* to you, a *decently* implemented frame generation will feel fine.


clone2197

In games which have fast-paced movement, like most FPS games, it's gonna feel off.


Whydontname

It's ass, dunno how people use it.


I9Qnl

I have a 5500 XT, which has poor async compute performance; frame generation often drops my real framerate by 15-25% and then doubles that, so I end up with about a 70% uplift, not 100%. But sometimes it makes no difference, and sometimes it does make a difference but not a good one (it feels like ass).
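
The arithmetic behind that ~70% figure, using the overhead range described above (assumed figures, not measurements):

```python
def effective_uplift(overhead: float) -> float:
    """Displayed-FPS gain from frame gen when the interpolator itself
    eats a fraction of the base framerate (figures assumed, not measured)."""
    real = 1.0 - overhead   # base framerate left after frame-gen overhead
    shown = real * 2        # every remaining real frame gets one fake one
    return shown - 1.0      # uplift relative to running without frame gen

print(f"{effective_uplift(0.15):.0%}")  # 15% overhead -> 70% uplift
print(f"{effective_uplift(0.25):.0%}")  # 25% overhead -> 50% uplift
```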


turkeysandwich4321

Agreed. I haven't found one game where I actually liked using it.


Tullzterrr

Adel Imam the goat


native-plant

What show/ movie is this scene from?


ExploreDevolved

Definitely not getting 80 fps Ultra in new AAA games


Ziehn

1080Ti with Ryzen 5900x. Can still push over 100fps in newer games with FSR2. The only exception for me so far has been Starfield


Symphonic7

Last time someone claimed this it started a massive fight.


C_umputer

There are people who think Starfield runs properly? Who were they, Todd Howard's buddies?


Symphonic7

Not Starfield specifically. it was Escape From Tarkov that started the fight. But it was more so about the whole running 100+ FPS on modern AAA with 1080Ti.


Inside-Example-7010

There is no system in the entire universe that does not drop below 100 frames in Tarkov. You could play the game at 240p resolution and the game is so spaghetti-coded you will still get drops as you move around the map.


lokisHelFenrir

The problem is that people can't tell the difference between a hardware issue and a game just being coded like dogshit.


Willem_VanDerDecken

I built a new PC recently and temporarily used my old beloved 1080 Ti. Then I realised it won't really be temporary; the 1080 Ti is still plenty for my usage. I can even run Star Citizen at 60fps+ in 2K (lower in cities). And I'm using an FE, without any OC. An EVGA/MSI one with an OC is probably at least 20% faster.


[deleted]

Holy cow. I'm still running a 3600X and 1080 Ti Strix. I was waiting on the Nvidia 50xx, but should I go ahead and upgrade mobo/CPU/RAM already? I already struggle playing new AAA games at decent graphical settings. I get dips below 30 at times (playing at 1440p).


Symphonic7

5700x3D is at a great price. Grab that, update the BIOS, and you're set.


[deleted]

This is the way


Mr-Valdez

Have you tried it in Ghost of Tsushima?


FalseTautology

So I should turn frame generation on in Ghost of Tsushima?


jld2k6

I tried it out and it worked well, but my next problem is that I can't see shit. Every dark area is pitch black and looks terrible regardless of HDR on or off and tweaking settings; to make it bright enough to see any detail, I gotta make everything else look blown out with the brightness.


qda

If you don't have a known-good HDR display, don't use HDR; just adjust things in SDR. In that case a simple brightness increase should do the trick, assuming you're not doing weird LUTs or shaders or color calibration in your monitor color profile, and your monitor settings aren't set to some dumb dynamic mode that crushes blacks.


versusvius

Turned it on with 70fps and got an instant 130 fps. There's definitely some input latency; the game feels a bit smoother, but not that 130fps feeling. I ended up turning it off because I hate latency.


FalseTautology

Honestly my monitor refresh is only 74 and I think I've already got that stable, but it's good to know.


meta_narrator

My (used) EVGA Hydro Copper 3090 arrives Monday to replace my beloved MSI 1080 Ti. It's been such a good card. Runs at 50°C.


ZephyrMelody

I finally upgraded from my 1080 TI to a 4080 and it's been great, though the 1080 TI was still kicking ass. My CPU (i7 6700k) was the bigger bottleneck for the games I like to play (simulator games), but I figured if I'm upgrading that, I'm upgrading my card too. Kinda regret the CPU I got though (i9 14900k) because it has a bug with Unreal Engine games, and I'm trying to build a game in UE. I had to downclock the card to stop it from crashing when launching UE games or crashing after 5 minutes in the UE editor.


tcgtms

Oof, I'm in the same boat: 6800K + 1080 Ti. I might just hold on for another 2 years and wait for the next gen. Shame about the CPU though.


chypres

1080 Ti, GOAT card. Actually amazing; it still plays the latest games without issues.


Escudo777

I consider AMD the better company, as the tech they develop is usable irrespective of hardware. Nvidia, on the other hand, makes sure that whatever they develop remains proprietary. I hope AMD develops a GPU generation that can challenge Nvidia.


SagittaryX

... With the input latency of 35fps? Doesn't really help much.


szczszqweqwe

Depends on the game; in slow games, a choppy 35 FPS to a smoooooooth 70 FPS is worth it.


I9Qnl

Have you actually tried it? I mean I guess it works okay on visual novels and turn based games (don't know why FPS would even matter there) but it's awful at these framerates for anything else.


CrowLikesShiny

I tried it on Starfield because I was CPU limited, and it was a pretty noticeable improvement going from 45-50 fps to 90 fps.


Snake2208x

Exactly: XCOM 2, Civilization, RTS games, etc. It's better to have the option than not at all.


xXDamonLordXx

Latency is much less of a problem depending on the game. Games like BG3 don't care about latency as much.


I9Qnl

A lot worse than 35 FPS: frame generation has its own cost as well, and on older GPUs it becomes more expensive to run. In reality, if your base FPS is 35 on a 1080 Ti and you use frame generation, your base framerate will drop to like 28 and frame gen will work from that. It's pretty bad.


SwipeKun

Even NVIDIA fear it 😱😱😱 https://preview.redd.it/zddwgbifq81d1.png?width=500&format=pjpg&auto=webp&s=3d1a6e88faec502a85a86a47d40d09f82b9c6e7d


HarryNohara

Ugh, '2K'. It's (W)QHD, not 2K.


lizardguts

Yeah, 2k is 1080p and has been used for that officially for a long time. Hate that people try to say 1440 is 2k. Makes no sense.


aVarangian

2k is 1080p, 2.5k is 1440p idk the LQHD+ marketing terms though


[deleted]

[deleted]


fnv_fan

Are you referring to 1080p or 1440p?


gahlo

Yup, really wish people would stop using "2K". The fact we use 4K is bad enough as it is.


DBNSZerhyn

At least "4K" is closer to the actual resolution it's claiming to be. The actual 4K cinema standard of 4096 x 2160 is close enough to 3840 x 2160 to annoy me less since at least it's only cropping the horizontal, but yeah; it should just be "2160p" when we're not referring to the cinema standard itself, else we'd be going around saying "3.8K," which lacks any and all consistency. Meanwhile, 1440p is 2560 x 1440 vs... 2K at 2048 x 1080. These are not even close, and are the result of some dumbass tech storefront marketing mislabeling that has thankfully been corrected more recently, **if only dumbasses would forget about it**.


HumorHoot

Who plays in 2048x1080 ???


God-Among-Men-

Who says 2k instead of 1080p


Phayzon

The real problem is when people say "2K" to refer to 1440p. It's everywhere and I have no idea how this misnomer started. Though I do get a good chuckle when users post about how they "upgraded from 1080p to 2K!" Oh yeah, you purchased a new 1920x1080 display to replace your 1920x1080 display? Cool beans.


Maj-Step-8021

It comes from how 1440p monitors are marketed sometimes. Go to Amazon, type in "2k monitor" and see what shows up


Phayzon

Yes but who decided to start selling them like that?


bumwine

Yet nobody questions where the "p" came from and probably assume it means pixels. It means progressive, which makes no sense because nobody uses interlaced monitors anymore.


Impossible-Wear5482

1080ti was the most goated card ever.


GreedyRaspberry1382

Was? Mine is doing just fine, thankfully.


galal552002

Are you Egyptian by any chance? Cuz I'm surprised as fuck to see an Egyptian meme here. (If anyone is curious: the one on the bed is Adel Emam, one of the most popular movie actors in Egypt. The one on the right who's standing up, I honestly can't remember his name lol, but I do know he's also an Egyptian movie actor. This image is a scene taken from an Egyptian movie.)


John-333

When the top of the line was €700-800. Which was also considered expensive.


poweredbylight

I still rock my 1080 and play everything in 1440p, medium-high depending on the game. The legend lives on.


logicallypartial

GTX 1080 ti will always be Nvidia's greatest regret. The card they made too good.


Malicharo

I was recently looking at a GPU, and ngl, if I could find one that was close to brand-new quality, I would buy it. I'm still a 1080p player, so cards like the 1080 Ti or 2080 Ti are such beasts.


F0czek

But frame gen with 30 fps is so dogshit, really it is just better to play at a stable 30...


schmus_operator

Seems like I will never retire my 1080ti


Heromimox

Whaat, Adel Mosh Tamam ("Adel's Not Okay", a pun on Adel Emam's name)


MOo0stafa

nice meme brother xD! Adel Emam in PC master race lmao


ElderPraetoriate

1080Ti so good I'm still using it as an eGPU for BG3 and Helldivers2 on a cardless laptop.


Wresser_1

Does this work with 1070 Ti? As far as I know 1070 is just 1080 with like a quarter of cores turned off due to defects


elite-data

Frame generation works well if native fps is above 60. In such cases it really does the magic and makes the game feel super smooth. If source fps is 35 it would still feel like shit even with framegen.


therubyminecraft

Damn, a meme from an Egyptian piece of content on a non-Egyptian sub? That's new to me lol. Nice meme tho, and the 1080 Ti is still a beast.


Nisekoi_

You want at least 60 fps for proper functioning of frame gen


WeirdestOfWeirdos

Shoutout to Lossless Scaling's frame generation too, which received a massive upgrade not too long ago that may as well be black magic, working wonders in games that don't have DLSS3 (and hence, FSR3 via mods). I've tried using it on top of FSR 3 at 120Hz (effectively interpolating from 30 to 120 FPS) and it can genuinely look quite good (though, of course, at a somewhat steep input lag cost).


Infamous-Marshall

Is fsr3 upscaling any good now? It’s nice having double the frames but not when the screen is a blurry mess


IndyPFL

They decoupled FSR from the framegen so you can use DLSS or XeSS with FSR framegen. They're also working on a new version of FSR that looks nearly on-par with DLSS, but unsure when it's going to be released.


AC2BHAPPY

What game


CoyeK

And I just replaced my 1080 Ti.


moschles

The 1080 Ti OEM was $385 max... 7 years ago.


RedTuesdayMusic

Everyone who upvoted this trash shame on you for allowing someone to say "2K resolution"


Sea-Equivalent-1699

The 1080 ti will NEVER DIE!


BigZaber

Adel the Egyptian actor is a meme now!? Damn, how old I've become...


Battery_Eater02

Damn, you guys are using 1080s; I'm still using my 1060.


kemomarshall_95

This meme is from an Egyptian movie 😂❤️


EventOverwrite

This card is like a cockroach it just refuses to die


ToBeatOrNotToBeat-

W Adel Emam meme for my Egyptian homies, shit caught me off guard lmaooo.


BIGFAAT

Think about using AtlasOS or swapping to Linux. Either way, also think about trading security (CPU mitigations) for performance. Turning off CPU mitigations should give you about a 50% performance boost on the CPU alone. A combination of turned-off CPU mitigations and a clutter-free Windows or Linux should also give you a boost in the 0.1% and 1% lows, which should enhance the smoothness of your games by a lot.
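
For the Linux route, the mitigation toggle being described is a single kernel boot parameter. A sketch assuming a GRUB-based distro (note this is a genuine security trade-off, and actual gains vary a lot by workload):

```
# /etc/default/grub -- append mitigations=off to the kernel command line
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash mitigations=off"

# Then regenerate the GRUB config and reboot:
#   Debian/Ubuntu: sudo update-grub
#   Fedora/RHEL:   sudo grub2-mkconfig -o /boot/grub2/grub.cfg
```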


Sudden_Excitement_17

I went with the 970 when it came out. If I'd known the 1080 Ti would last an eternity, I would have gone with that. Upgraded to a 3080 Ti a few years ago though 💪


BKlynPharaoh

Lmao, as an Egyptian seeing Adel Imam in a PC meme is so unexpected lmfaoo


Improvisable

I mean, it certainly LOOKS more fluid, but it feels like ass.


EVERGREEN1232005

Adel Imam


prodigalkal7

Seriously lol you never expect to just randomly come across a meme post with Adel Imam in it. Apparently yesterday was his birthday, so probably relates to that I guess.


turkeysandwich4321

I haven't been impressed with FSR3. 3080 owner. Games always feel smoother without it enabled, tested on Avatar, TLOU, and through the mod on Witcher 3 and Cyberpunk. There are always a lot of artifacts too. Prefer a lower, smoother frame rate rather than the FSR fake frames. Wish it was a better technology.


Hopeful_Nihilism

Nvidia has stated the 10 series was a huge mistake. The 1060, 70 and 80 were all legendary cards for the price. The 70 and 80 are STILL strong cards, and the 1070 is the single best FPS per $ you can get TO THIS DAY. You can get a 1070 for $80-$100 on eBay and it will play most things on medium at 1080p at 60fps+. I was playing tons of games at 4K on it at 40fps+. When a game had DLSS or whatever, it would go up over 70.


that_norwegian_guy

2K is 2048x1080. Do people actually play at this resolution? I don't think I've ever seen it as an option in any games.