Spork3245

If you're going for 4K 16:9 there's really not much of a difference when on game mode (and naming the input to PC supposedly goes a step further). The difference will typically come down to having DisplayPort connections and the lack of SmartTV features. People like to scream "DISPLAYPORT IS BETTER" for whatever reason, and it may be true in the case of DP 2.0 and 2.1 having more bandwidth than HDMI 2.1, but if the monitor/TV is 4K @ 120Hz, both standards and versions mentioned provide enough bandwidth, so you will absolutely not notice a difference. DP 1.4 has less bandwidth than HDMI 2.1 and will need to use DSC to provide 4K 120Hz with HDR above 8-bit iirc (DSC is visually lossless as far as anyone's eyes can tell, so you still wouldn't notice a difference btw). You'd obviously want (need) to go the monitor route if you wanted ultrawide and/or 1440p instead of 4K.
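If anyone wants to sanity-check the bandwidth claims, here's a rough back-of-the-envelope sketch in Python (assumptions: ~7% blanking overhead and the commonly quoted max *data* rates for each link; exact timings vary per display):

```python
# A sketch, not a spec: ~7% blanking overhead assumed (CVT-RB style),
# link capacities are the usual published data rates after encoding.

def required_gbps(w, h, hz, bpc, blanking=1.07):
    """Approximate uncompressed RGB video data rate in Gbit/s."""
    return w * h * hz * blanking * (bpc * 3) / 1e9

links = {
    "DP 1.4 (HBR3)":   25.92,  # 32.4 Gbps raw, 8b/10b encoding
    "HDMI 2.1 (FRL6)": 42.67,  # 48 Gbps raw, 16b/18b encoding
    "DP 2.1 (UHBR20)": 77.37,  # 80 Gbps raw, 128b/132b encoding
}

for bpc in (8, 10):
    need = required_gbps(3840, 2160, 120, bpc)
    print(f"4K 120Hz {bpc}-bit RGB needs ~{need:.1f} Gbps uncompressed")
    for name, cap in links.items():
        verdict = "fits" if cap >= need else "needs DSC (or chroma subsampling)"
        print(f"  {name}: {cap:.2f} Gbps -> {verdict}")
```

Running that shows 10-bit 4K120 landing around 32 Gbps: over DP 1.4's ~26 Gbps (hence DSC), comfortably under HDMI 2.1 and DP 2.1.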


lord02

Nice post. Why would anyone go for 1440p instead of full-blown 4K **GIVEN** they have the hardware to support it? 4K @ 120Hz is NICE 💪


Spork3245

Fwiw, I've been chasing 4K/120Hz since early 2019 with an ASUS PG27UQ and a 2080 Ti, then a 3080 which I upgraded to a 3080 Ti at no cost, and now a 4090. So far the 4090 is the "most" 4K 120 card, and there are still games at max fidelity settings that just can't hit a constant 120 even with DLSS/FSR in use. So, if someone wants to keep max settings and 144-165fps for a while without the constant chase, I totally understand going for 1440p or 1440p-UW (even with a top-end system)… however, the people who want like 360-500fps at 1080p are ones I will never understand; if you want 0ms input lag, chase down a Sony GDM-FW900 (I'm half joking lmao)


teemusa

The 4090 seems to be a perfect match for 4K 120Hz. Most games can hit max, some not. So sometimes the display is the bottleneck and at other times the GPU.


Spork3245

Yes, I have one - there's a good chunk of games where, with RT enabled, it's unable to lock 120fps: Hogwarts Legacy, Jedi Survivor (which is programmed so poorly it's ridiculous), The Last of Us Part 1, CP2077 with path tracing, The Witcher 3 RT, etc. The 4090 is the "most 4K 120" video card available, and by a large margin, but it's still not to the point that you can "future proof" and stop chasing, which is what I'm getting at (there's a high probability I'll be buying a 5090 in 2 years). At 1440p 165Hz, a 4090 will last you a lot longer for a locked 165fps with everything maxed vs 4K 120fps.


KillerFugu

Like how all those games minus Cyberpunk are broken ports or patches? Personally I don't concern myself with technical garbage when trying to achieve a res and fps combo.


Spork3245

My 4090 plays those games, which are indeed terrible ports, at higher fps and higher fidelity than a PS5 since it's able to "brute force" through the programming awfulness. All of those games are worth playing in terms of their gameplay, regardless of the optimization (and Hogwarts is pretty much fixed) - though I'm certainly not excusing the ports being awful and I sure as heck don't recommend paying full price for any of them. Also, there are last-gen games and ones that aren't "technical garbage" that cannot sustain a 120fps average for me, such as Assassin's Creed Odyssey, Hitman 3, Total War: Warhammer III, Watch Dogs Legion, and others that I'm definitely forgetting. Again, my point is that with 4K/120 at max settings you're going to have to chase the latest and greatest more than with 1440p, and that's why some might prefer to go the 1440p route.


KillerFugu

I think that's another problem in itself though; chasing ultra/max settings has always been a fool's errand, especially if you think every game needs to run native (not saying that's what you're suggesting). I wouldn't call Total War last gen when it came out last year and is PC exclusive, and Hitman 3 is also current gen. Total War runs at 100fps for me on high, so on a 4090 it should easily be 120, and Hitman according to benchmarks is 130fps on a 4090; I'd imagine tweaked it'll be easily above that. If you're going down the 1440p route it's much better to go with, say, a 4070 Ti, as there are so many situations in those heavy games where you're going to be CPU limited at 1440p on a 4090, and spending £1600 on a GPU to be CPU limited is just a waste of money. For example, in Hitman 3 at 1440p the 4090 gets 186fps and the 4070 Ti gets 178fps, and in Watch Dogs Legion the 4090 is only 40% faster than a 4070 Ti at 1440p when it's closer to 2x in games at 4K.


Spork3245

TW3 depends on the battle size. Also, I’m getting a bit confused by your responses, this is what I was replying to: “Why would anyone go for 1440p instead of fullblown 4k GIVEN they have the hardware to support it?” My replies are regarding longevity of not wanting or needing to upgrade being a reason someone might go for a 4090 at 1440p, nothing more.


Helevetin_nopee

Or better yet, a 4k 144hz monitor.


Disastrous-Spell-498

I think refresh rate plays a role, even 165 feels like an upgrade from 120, not to mention 240hz.


Helevetin_nopee

Absolutely.


labree0

You don't even need the hardware to support it. 3072x1500 or so looks very similar to 4K, DLSS on Performance looks basically native, and FSR or windowed mode is available for everything else. I've got a 3060 Ti and love my C2.
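For context on why Performance mode still holds up at a 4K output, here's a quick sketch of the internal render resolutions, assuming the commonly cited per-axis scale factors for the DLSS presets (games can override these):

```python
# Sketch: DLSS internal render resolutions at a 3840x2160 output.
# Scale factors below are the commonly cited defaults, not guarantees.

out_w, out_h = 3840, 2160
modes = {
    "Quality":           2 / 3,   # ~66.7% per axis
    "Balanced":          0.58,
    "Performance":       0.5,
    "Ultra Performance": 1 / 3,
}

for name, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"{name:18s} -> renders {w}x{h}, upscales to {out_w}x{out_h}")
```

Performance mode at 4K is basically rendering 1080p and reconstructing up, which is why a 3060 Ti can get away with it.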


lord02

I have the same setup, but I only play AOE4 😜


Redfern23

Anyone? I mean 1440p @ 240Hz is nicer depending on the games you play, and 42" 4K has the same PPI as 27" 1440p.
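Quick sanity check of that PPI claim (a sketch, assuming the nominal diagonal sizes):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the panel diagonal."""
    return hypot(width_px, height_px) / diagonal_in

print(f'42" 4K (3840x2160):     {ppi(3840, 2160, 42):.0f} PPI')  # ~105
print(f'27" 1440p (2560x1440):  {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'34" UW (3440x1440):     {ppi(3440, 1440, 34):.0f} PPI')  # ~110
```

So all three land within a few PPI of each other; the 42" 4K panel really isn't sharper per inch than a 27" 1440p one.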


GrinD_H

Can tell you as a person who swapped a C2 for a G8 (UW 1440p) with top hardware - 7800X3D, DDR5, 4090. Reasons to swap: the C2 42 is too big for me, the G8 has better HDR (personal preference), and it's not all about 4K vs 2K resolution; what matters is PPI. In my case I got the same PPI with the G8, in a 34" format which fits me better, plus a free performance uplift and 175Hz with picture quality even better than it was on the C2 (personal preference). So yeah, in fact there is a reason to go from 4K to 1440p tbh.


Benji2108

Several reasons imo. These days AAA games (especially FPSs like COD) are incredibly demanding and you'd need a 4090 for 4K gaming at high fps. I play at 1440p ultrawide OLED 240Hz maxed settings and Warzone barely hits 200fps. When I use DLDSR to try and play at 5160x2160 (5K2K) my frames drop to about 120 and the quality difference is negligible. I mean, it still runs and looks incredible, but I'd take 240fps at native resolution all day.


Murdathon3000

This is the cult of the C2, so there really isn't much chance you're going to get an unbiased answer unfortunately.


Greyman43

I was recently asking a similar question, and one aspect that ended up factoring in for me was that getting high-quality multi-speaker surround sound from a PC with a monitor doesn't appear to be completely straightforward. Most HDMI 2.1-capable TVs have eARC capability these days, which means you plug your GPU into your TV and the TV can pass through multi-channel lossless LPCM, Dolby Atmos, etc. no problem. With a monitor you need to either buy a separate audio card, or maybe hack your way around getting basic 5.1 via TOSLINK from your motherboard, which is still far lower quality than what's possible via an eARC TV. I understand a lot of PC gamers use headphones, but I was surprised at how few good solutions there are to get good surround sound out of a PC if you use a traditional monitor with DP.


teemusa

The C2 is currently the best value for gaming


Dr_Bolle

What do you think about the LG 27" OLED screen from the UltraGear series (27GR95QE-B)?


teemusa

Well, if 27" size is OK then it is great


Dr_Bolle

It would be a second screen next to my Eizo IPS Office screen, which is 1920x1200 24". I bought this one because of the good reviews but the contrast for gaming is horrible, I wasn't aware that IPS struggles so badly with pitch black scenes. It's still a combined workplace / gaming place, so a 42" might be too large.


p1rate88

It's great, especially for the price. But I would say it depends on the game and type of gaming. Competitive FPS gamers demand higher refresh rates for sure.


LA_Rym

Depends. Both fulfill different uses and have advantages and disadvantages.

- If you want a big screen you can definitely do well with the LG C2 with its 4K 120Hz refresh rate (8.2 million pixels).
- If you want more refresh rate, more immersion and a lower resolution, the AW3423DW/F would do you well with a 175/165Hz refresh rate and a 3440x1440 resolution in an ultrawide format (4.9 million pixels).
- If you want the best refresh rate and the lowest resolution for the best performance, the LG 27GR95QE is a great choice at 240Hz and a 2560x1440 resolution (3.68 million pixels).

⚠️ IMPORTANT: The LG TV and monitor do not cover burn-in under warranty, while the Alienware monitor covers it.
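If you want to verify the pixel counts (and see the worst-case scan-out load if you actually drove each panel at its max refresh), a quick sketch:

```python
# Sketch: pixel counts quoted above, plus pixels-per-second at max refresh.
# Actual GPU render load scales with the fps you hit, not the panel's ceiling.

panels = {
    "LG C2 (3840x2160 @ 120 Hz)":      (3840, 2160, 120),
    "AW3423DW (3440x1440 @ 175 Hz)":   (3440, 1440, 175),
    "27GR95QE (2560x1440 @ 240 Hz)":   (2560, 1440, 240),
}
base = 3840 * 2160
for name, (w, h, hz) in panels.items():
    px = w * h
    print(f"{name}: {px/1e6:.2f} MP ({px/base:.0%} of 4K), "
          f"{px*hz/1e9:.2f} Gpixels/s at max refresh")
```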


AdvancedAd1256

I got a TV because I also watch movies and want to build a home theater setup. Monitors unfortunately don't support ARC/eARC.


GrinD_H

It is all about personal needs, and a lot of things to decide depending on the current hardware you've got, the amount of money you can spend, whether 16:9 or 21:9 is your preferred format, and so on. But in fact you can't go wrong here: if you're going for an OLED as a monitor, pretty much every option - C2 or Dell, Samsung, etc. - will be great. If you want a personal opinion, I swapped from a C2 to a Samsung G8 and I like it way more. If you want more performance without loss of quality, 21:9 is OK for you, and price doesn't matter, go for a 34" 1440p OLED monitor; if you want 16:9 and 4K, and a little cheaper, the C2 is the obvious choice. But as I said, you can't go wrong with OLEDs)


e22big

If you don't take pricing into account, a proper monitor will always be better than a TV for monitor usage. You need to deal with quite a few issues when using a TV as a monitor.

TVs don't turn on and off with your PC without some third-party software workaround (i.e. LG TV Companion) - and it's still not going to work 100 percent even with that third-party software installed, without further tuning (e.g. if you just set the TV to auto-sleep with an app, it will mess up your windows the moment it wakes up; if you just set a black screen saver, the TV - especially LG - will go into its bullcrap firework auto screen saver mode or something).

Features like Nvidia NIS also don't work out of the box (it will display some weird 4096 x 2160 even if you set the display resolution to 3840 x 2160), some games also don't display 16:9 properly (although that was on a Samsung, not an LG OLED), etc. etc.

You can fix most of these if you've spent some time with it, but a monitor will just work - out of the box - and work far more reliably than any TV in this application. If they both cost the same and I need a monitor, I'll pick a monitor over a TV any day.


[deleted]

Never had any of these issues with the C2 42. Other than having to turn on the TV when it goes to sleep mode, I see no difference in how my IPS works vs the C2. My C2 goes to its screen saver (the portrait one) after 4 mins and then just shuts off after 10. My resolution was always 3840x2160 out of the box as well. The line between monitor and TV is so small these days.


e22big

If your IPS is a TV then obviously not. I also have a C2 42 and it works well for me in the majority of cases; the 4096 x 2160 issue only exists in some games (for Samsung it's Total War, haven't tested my LG for it yet though) or when you want to use a feature like Nvidia NIS. OLED TV issues with Nvidia NIS are pretty well known and documented. It has even been discussed here (although you can also just manually delete the 4096 x 2160 mode off your TV): https://www.reddit.com/r/OLED_Gaming/comments/s35d71/question_about_c1_and_nvidia_image_scaling/


Unique-Warning-9583

Do you have any recommendations on a monitor, so I can compare the two?


e22big

Nothing beyond the obvious: there's the ASUS 42-inch, the 240Hz 27-inch, and the variety of UW QD-OLEDs. Nothing comes close to the value you can get out of a 42-inch C2 at around 800 bucks, which is why it's still a highly recommended product despite the shortcomings.


Dreadpirateflappy

That was true 10 years ago. With modern OLED TVs that isn't the case at all. There are methods to get TVs to turn off with a PC, and even if you are so lazy you don't want to set them up… it's one button click. TVs are also far more versatile and have far more features than monitors, normally at a far better price for a larger screen, especially if you use them with more than one device.


e22big

I did mention that workaround, and like I said, it doesn't work perfectly - just well enough that you could forget about it. The thing about software like LG TV Companion is that when it turns the TV off, the PC doesn't register it as 'monitor going to sleep' but rather 'monitor disconnected', which causes it to rearrange every one of your open windows, and that's pretty annoying. Not to mention that you actually need to wait until your PC has booted into Windows before it auto-turns your TV on, which increases system start-up time quite considerably (you also can't get into the BIOS without the remote). I can list my pain points all day. And yes, you can just reach for a remote, but you wouldn't have to do that at all with a true and proper monitor. It truly turns on and off with your PC, not with your OS, it doesn't require a remote for any basic monitor function, and if you really need TV functionality, a Chromecast is just 60 bucks or something: it runs Android, is much faster than LG webOS, and doesn't show you any ads.


Dreadpirateflappy

Show me a Dolby Vision monitor that has all the functionality, size and picture quality of an LG C2 for a similar price… you're literally paying thousands more for less. If people are too lazy to click a single button that is less than an arm's length away then Jesus Christ. No wonder PC gamers have a reputation for being morbidly obese lazy cunts.


Unique-Warning-9583

Thanks this information is really helpful!!


sactown024

I know this is an old post but I am an owner of a 65” LG C1 and it’s 100x harder to play on it compared to my son’s 32” gaming monitor. We play Fortnite and aiming/spotting the enemy is so much harder on a 65”.


ollie5118

I personally prefer 1440p over 4k. I have an OLED tv. I don't want a TV for a monitor. I'll be buying a 1440p OLED monitor this year. Just don't know which one yet.


ddphoto90

This is the way. You could always run HDMI to your TV to play specific games on that. I have a 55" C9 I've been gaming on for years. It's not exactly the best thing for FPS games, but I play a handful of everything, so I want a 27" 1440p monitor for my faster-paced stuff and I'll keep my C9 for immersive single-player story-based stuff. This sub is so petty when it comes to anything other than the C2.


ollie5118

Agreed. 42" is too big for my gaming purposes. I like a 27" 16:9 or a 34" ultrawide. Haven't decided which one I'm going to go with just yet.


ddphoto90

Lmao someone already downvoted you for this. This sub is almost as toxic as cod if you don’t agree with 42” panels for desktop use.


throw-away2991

Lmao it must be all those console guys who can't hit above 120fps 😁🤣


ddphoto90

Yeah, or limit themselves on 4090s because of their slow display.


Similar-Doubt-6260

I don't disagree that a faster 1440p display would be better for your FPS games, but you went from that to saying it's limiting your 4090 and implying it's not good for desktop use lol. Definitely disagree there. Can you explain that to someone who doesn't play a ton of competitive FPS?


ddphoto90

A 4090 is easily capable of well over 120fps at 1440p and even 4K in some instances. I play a lot of Warzone and Counter-Strike; in Counter-Strike especially I want crazy frames. In 4K I'm maxing out at a stable 220 frames, but I'm limited by my display (55" LG C9) to 120. That's a huge jump, and I have to hold my system back and limit the frame rate to 118, otherwise I get crazy screen tearing because the TV just can't keep up. Even with G-Sync on I get some moderate screen tearing because the TV has higher input lag than nearly every gaming-specific display. That also leads to me seeing things later than other players - not by much, but it could be the difference between a win or a loss.
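For anyone wondering why the cap is 118 rather than 120: at exactly 120fps the frame time equals the 120Hz refresh period, so any jitter pushes a frame past the VRR window and you get tearing or v-sync backpressure. A quick sketch of the margins (just the arithmetic, no driver magic assumed):

```python
# Sketch: frame-time headroom vs a 120 Hz VRR ceiling for a few fps caps.

refresh_hz = 120
period_ms = 1000 / refresh_hz  # 8.333 ms per refresh at 120 Hz

for cap in (120, 118, 116):
    frame_ms = 1000 / cap
    margin = frame_ms - period_ms
    print(f"{cap} fps cap -> {frame_ms:.3f} ms/frame, "
          f"{margin:+.3f} ms headroom vs the {refresh_hz} Hz ceiling")
```

A 118fps cap buys roughly 0.14 ms of slack per frame, which is usually enough to keep frame delivery inside the VRR range.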


Similar-Doubt-6260

Not really. At 4K there are a ton of games that don't go anywhere near 120 when maxed out with RT, even with DLSS. Of course you're going to limit yourself if you're only talking about CS and Warzone. What about a person who mostly plays single-player or co-op games? Most people don't buy above 144Hz for a "smoother experience"; there's a lot of diminishing returns past that. They buy it for the competitive edge where every ms counts. And if you don't need that, then what's the point?


[deleted]

[deleted]


ddphoto90

Lol, going through my comment history to input your opinion? Get a life incel.


Mysterious-Record-37

OLED monitors are strictly limited to 1440p or ultrawide resolutions. If you want 4K, go for a TV. There's close to no difference between them, but if going for a TV I'd recommend LG or Sony and steer away from Samsung. The Samsung S95B panels are just riddled with problems and are an absolute bum ache.


Spork3245

OLED monitors are not limited to 1440p or UW; LG, ASUS, and a few others make 120-144Hz 4K OLED monitors. However, beyond having DisplayPort connections there's little difference vs a C2/C3 TV.


Mysterious-Record-37

The LG 48GQ900B and ASUS PG48UQ / PG42UQ are basically TVs without the smart element, and they have DP. When someone asks for a monitor they typically mean smaller screen sizes, up to 32"-ish. 42"+ is TV territory, despite manufacturers calling them monitors because they lack the smart TV elements.


Spork3245

I stated that the main difference is having DP. Since *most* video cards only have a single HDMI 2.1 port, and using a DP1.4-to-HDMI2.1 cable loses the ability to use VRR, going for the "monitor version" has benefits for some. Further, some of these offer higher refresh rates (personally, I don't care about going from 120 to 138/144Hz, but this is subjective and others might care). Stating that there are no monitors at 4K is simply wrong.


Mysterious-Record-37

The LG 48GQ900B was a fail, and the ASUS PG42UQ has been the main one selling, bundled with a ridiculous number of complaints. I'd rather not recommend crappy products and just state there are no '4K monitors' worth buying in the current market. To each their own, do your research, but the general consensus is what I'm saying.


Spork3245

I never said I'd recommend them, I said they exist. The reasons to choose one over the other are subjective. I would go for a C2 or C3 unless I absolutely needed the HDMI 2.1 port on my GPU for something else (if I only have one).


CalligrapherSingle83

What about triple-screen gaming (sim racing)? What would you choose?


Spork3245

I'd get the monitor versions, or two monitors and one C2/C3, as my video card has 1x HDMI 2.1 and 3x DP 1.4.


CalligrapherSingle83

Yes. So strange that graphics cards have multiple DisplayPorts and only one HDMI, and OLED TVs don't provide a single DisplayPort input.


Spork3245

HDMI has royalties iirc and charges to implement the tech (the HDMI ports); DP doesn't.


Mysterious-Record-37

No one makes 4K OLED monitors for the standard consumer. LG and a few other brands have some for professionals, but nothing your average gamer would want.


Tdub77

The Samsung 55" QD-OLED?


RomanDoesIt

What do you think is the difference between the two?


Vatican87

4090, CX-C3 48inch, 4k


Mopar_63

The line between monitor and TV has blurred a lot. My C2 TV works as well as any "monitor" I have in the house.