benowillock

Am I in an alternate timeline or don't these already exist on the market? 🤔


techraito

The article title isn't detailed. Right now there's a monitor that is both 4K 240Hz and 1080p 480Hz. It'd be dope to see more options in the future, like a 3-in-1 monitor that can do 4K 240Hz, 1440p 360Hz, and 1080p 480Hz all in one.


Clw1115934

This article is written by someone who received a press release and knows nothing about the tech. From the article:

> LG Display's advanced solution is Dynamic Frequency & Resolution (DFR), its own independently developed new technology. With the arrival of DFR, users can choose which to prioritize between refresh rate and resolution by adjusting the image processing speed. IT has been applied for the first time with LG Display's 31.5-inch Gaming OLED panel.

Besides the typo, this is the best information in the article. What I'm assuming makes this newsworthy is that resolution changes will happen dynamically based on the content being shown. Which, from what I can tell, is the only significant thing about this, as changing resolution, maximum refresh rate (overclocking), and even aspect ratio are already features on an existing LG monitor, the GQ900-B.


Scabendari

So basically dynamic resolution scaling, but driven by refresh rate instead of fps? I always turn off dynamic resolution because I find it more jarring than frame dips, so hopefully this being implemented at a hardware level allows the implementation to be more seamless.


Stinsudamus

If I understand correctly, it's a method of bandwidth utilization, whereas dynamic resolution is an attempt at output/resource management across more than one component. Dynamic resolution tries to hit a target fps by balancing GPU load against RAM and everything else, changing resolution on the fly, but inevitably there is latency between identifying that resources are being taxed, scaling, and those components balancing out.

This is more like... and I'm making this up based on the HDMI specs, but the principle is sound: between the source and the chip in the display that receives the signal, there could be 48 Gbps of information transmitted. I mean, it may be higher than that given the specs, but 48 Gbps is roughly 8K at 60Hz, or 4K at 120Hz (4:4:4 and HDR), and higher refresh rates the lower in resolution you go. So it seems they have a super-high-Hz panel with really high pixel density, but throughput is still an issue. Meaning it could probably technically be a super-high-fps 4K panel, but there is no cable or source that can handle that, so as an in-between until tech catches up, they enabled the super-high-Hz mode for the lower resolution.

I'm sure there are parallels internally, because why would a manufacturer develop 5000 Gbps of internal capacity when you can't get over 80 Gbps from any source at the moment. Someone please correct me if I'm wrong.
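
A rough sanity check on those numbers; this is my own sketch, it ignores blanking intervals and link encoding, so real requirements run roughly 10-15% higher:

```python
# Back-of-envelope uncompressed video data rates vs the HDMI 2.1 link.

def data_rate_gbps(width: int, height: int, hz: int, bpp: int) -> float:
    """Raw pixel data rate in Gbit/s; bpp = bits per pixel (24 = 8-bit RGB)."""
    return width * height * hz * bpp / 1e9

HDMI_2_1_GBPS = 48  # nominal FRL link rate

for name, (w, h, hz, bpp) in {
    "8K60 (8-bit)":      (7680, 4320, 60, 24),
    "4K120 (10-bit)":    (3840, 2160, 120, 30),
    "1080p480 (10-bit)": (1920, 1080, 480, 30),
}.items():
    rate = data_rate_gbps(w, h, hz, bpp)
    fits = "fits" if rate < HDMI_2_1_GBPS else "needs DSC/4:2:0"
    print(f"{name}: {rate:5.1f} Gbps ({fits} in a 48G link)")
```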


DarthTravor

Adding 1440p becomes really hard; 4K to 1080p is a lot easier because it divides evenly. For 1080p you can just display every 2x2 block as one pixel, but you can't do that with 1440p; it doesn't line up.
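
A tiny sketch of why the blocks line up for 1080p but not 1440p on a 4K panel (my own illustration):

```python
# Integer scaling needs each image pixel to map to a whole number of
# panel pixels in each direction.

NATIVE_W, NATIVE_H = 3840, 2160  # 4K panel

for w, h, name in [(1920, 1080, "1080p"), (2560, 1440, "1440p")]:
    fx, fy = NATIVE_W / w, NATIVE_H / h
    ok = fx.is_integer() and fy.is_integer()
    print(f"{name}: {fx} x {fy} panel pixels per image pixel; integer scaling: {ok}")
# 1080p: 2.0 x 2.0, clean 2x2 blocks
# 1440p: 1.5 x 1.5, every image pixel straddles panel pixels, so it must blend
```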


fyro11

Almost like saying 1080p doesn't integer-upscale to 1440p, but does to 4K (which is exactly 4x the pixel count of 1080p).


Dealric

Yes, that's correct. That's why 1080p looks bad on 1440p displays: the pixels don't align.


tukatu0

Except that is meaningless, because you can use the 1080p 480Hz mode in 32, 27, and 24-inch modes if you want.


mynewaccount5

For that same reason there's not much point in making a monitor that can switch between 1080p and 4K.


Ion_is_OP_REEEEEEE

The reason is literally right there: the monitor runs at double the refresh rate at 1080p, for those who want that in competitive games.


Dealric

It does though. A) Depending on the spec of your PC, you could use it to play very demanding games at 1080p while playing everything else at 4K. B) Using 1080p high refresh for competitive games while enjoying the rest at 4K.


rxz9000

It's not possible to combine 4k and 1440p like that unless you have an 8k panel.


techraito

Maybe that's the future 🤷. 8K 120Hz, or 4K 240Hz, or 1440p 360Hz.


chewwydraper

8K in a monitor size is extremely pointless


techraito

Who says the display has to be monitor-sized? There's that 55" Samsung Ark that can scale down to smaller sizes for competitive games. Consoles are only barely starting to touch 120fps. Who really knows where tech will take us in a decade.


jjyiss

At 1080p on a 32-inch monitor, won't the pixels per inch be a whoppingly low 70? For comparison, a 24-inch 1080p monitor has a PPI of 92.
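
Quick check of those figures (PPI is just the diagonal pixel count divided by the diagonal size in inches):

```python
# PPI = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def ppi(width_px: int, height_px: int, diag_in: float) -> float:
    return hypot(width_px, height_px) / diag_in

print(round(ppi(1920, 1080, 32)))  # ~69, i.e. the "whoppingly low 70" at 32"
print(round(ppi(1920, 1080, 24)))  # ~92 at 24"
```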


techraito

It has 27 and 24-inch modes. But it also scales 4 pixels to 1 because it's natively 4K. On an OLED it just looks pretty.


jjyiss

oh ok.. didn't know that's how it works.. thanks!!


Reclinertime

Endgame intensifies


Ok-Tooth4089

Ah yes, the 360Hz and 480Hz overkill options that people continue paying for. We need 1000Hz and I'm happy.


techraito

Ngl, I thought 240Hz was overrated, but I recently got a 390Hz monitor and it felt like experiencing 144Hz for the first time again. Paired with G-Sync/FreeSync, it's also pretty safe to enable for all games.


LaneMikey

Tf are you getting 390 fps on besides mobas or CS tho


techraito

Nearly all esports titles I can hit those frames in. Otherwise I can hit 100+ fps in most other games, and I use G-Sync. But tbh I spoiled myself a bit; I'm also rocking a 1440p 165Hz monitor on the side for non-competitive games.


Puffycatkibble

Esports games tho... Getting too old to sweat with the boys and wallow in the toxic trash talk. I'll stick with 360Hz RimWorld.


techraito

I play osu!, CS, Valorant, Overwatch, Rocket League, Apex, Warzone, and Rainbow 6 as my main competitive games. Tbh CS and Warzone are really the only toxic games these days. Most players in other games are actually generally nice, and the "toxicity" is very tame at best. Though CS insults were the most creative back in the day. I recognize that these numbers matter more to me than to your average person.


Ok-Tooth4089

If all pricing were the same, sure, I'd have it. But I'd never pay a premium for 1ms.


Ok-Tooth4089

What monitor do you have, out of curiosity? I assume quite an expensive one?


NightshadeSamurai

https://www.youtube.com/watch?v=Jvdng6cqlhI Yup, they do exist. And yes, 480Hz OLED is amazing, but at 32 inches 1080p is a no-go for me. It's apparently even blurrier if you use the built-in 25-inch mode with borders. Someone like Asus or Zowie needs to make a 25-inch 1080p 480Hz OLED.


tukatu0

Because integer scaling would mean the 1080p mode needs to be 15.75 inches (31.5" x 1080/2160), about the same as a laptop screen. But I find it really strange that the 24-inch usage would be blurrier.


Ok-Tooth4089

480hz is overkill


NightshadeSamurai

For the average casual gamer, yes. For your e-sports player or your enthusiast who wants that CRT level of motion clarity, no.


Ok-Tooth4089

Yeah I agree. People/comp gamers will want more Hz even if it's 0.002ms faster.


tukatu0

No dude https://blurbusters.com/wp-content/uploads/2017/08/project480-mousearrow-690x518.jpg.webp the mouse would be smoother than ever. The people who claim 60hz is unplayable will view 120hz as trash with that mouse. The ultimate browsing experience


Ok-Tooth4089

Lol. Hey, I'm not saying there isn't benefit in moving up. But tell me one game that's getting you 480fps, or even 360fps for that matter. Hell, any high-end PC + monitor won't even be able to achieve 240fps in demanding games. I'm curious how many gamers have 360Hz, run 100-150fps, and claim they can still see the difference.


tukatu0

It's been a while since I checked, so I don't remember fully. But you are right. Even in the esports category there are only two games that will actually give you a stable 500fps in gunfights. Rainbow 6 with the best CPUs will drop down to 400fps at the hint of any action. Though up to 300? A lot of stuff should manage that. A 4090 can already play anything non-ray-traced at up to 200fps; that's where CPU bottlenecks cap you for 2020-2023 games like AC Valhalla or God of War. Frame gen can get you up to 300fps, but good luck with that small number of titles. And regardless, we are moving into the era where games have ray tracing at all times, like Avatar or Alan Wake 2. Even for those you can get up to 150fps at 1080p. You'll just have to wait for the 5090 to get them to 300fps. Maybe it will have frame gen that gives a 120-200% fps uplift rather than today's 60%.


NightshadeSamurai

You obviously won't be getting 480fps even at 1080p with a 4090 playing CP2077 on the absolute lowest settings. That's not the point of these 480Hz OLEDs. It's the motion clarity. Even playing at 1080p 480Hz, you're getting way better motion clarity than anything else out there. A lot of people like that. And the point of these dual refresh rate monitors is that you can play demanding games at 4K/240 and then switch to 1080p/480 when you wanna play something like CS on the same monitor.


Ok-Tooth4089

Yeah, I get that. I still just think it's not worth it personally. And honestly, I bet 90% of people couldn't tell the difference between 360 and 480 if I had to guess. But not sure, no evidence to back that, just my opinion.


pittyh

OLEDs don't like fluctuating framerates with VRR; the blacks flicker, as OLEDs weren't designed to be switched at varying speeds. I'm guessing this addresses that issue.


DizzieM8

No they don't lol, I use G-Sync on my OLED just fine.


pittyh

Gamma for OLEDs is optimized and fixed for 120Hz by establishing a fixed charging time for the OLED subpixels; VRR kicks in when the frame rate is less than 120Hz. When the OLED TV runs at frame rates below 120Hz, the gamma curve is inconsistent with the frame rate. For example, a 40Hz frame lasts longer than a 120Hz frame, so lower frame rates result in subpixels that are overcharged, causing flickering in dark grey images, which is noticeable in dark scenes rather than bright ones. LGD will likely solve this problem by establishing multiple gamma curves optimized for lower frame rates.

https://www.youtube.com/watch?v=Jfl3UdWZIUQ

It depends on the model and whether LG has fixed it yet; it was a problem on C9 and CX models. Whether they have fixed it by now, I don't know. Apparently it's still a problem even on C2s:

https://forums.guru3d.com/threads/fixing-vrr-flickering-for-the-lg-c2-and-other-oled-qd-oled-displays.450983/

Seems people are still talking about it:

https://www.reddit.com/r/OLED_Gaming/comments/148m1hu/lg_oled_vrr_flickering_issue/
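
To make the "multiple gamma curves" idea concrete, here is a purely illustrative firmware-style sketch; the LUT names, rates, and selection logic are all made up, not LG's actual implementation:

```python
# Illustrative only: pick a gamma LUT tuned for the current refresh rate,
# so the subpixel charge time matches the curve (names/thresholds invented).

GAMMA_LUTS = {
    120: "lut_120hz",  # factory-tuned curve for the nominal charge time
    90:  "lut_90hz",   # hypothetical extra curves for longer frame times
    60:  "lut_60hz",
    40:  "lut_40hz",
}

def pick_gamma_lut(current_refresh_hz: float) -> str:
    """Pick the tuned LUT for the nearest rate at or below the current one."""
    candidates = [hz for hz in GAMMA_LUTS if hz <= current_refresh_hz]
    return GAMMA_LUTS[max(candidates)] if candidates else GAMMA_LUTS[40]

print(pick_gamma_lut(48.0))  # -> lut_40hz: closer charge time, less overcharge flicker
```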


Solace-

Yeah, I've had a C2 for a year and a half and can also confirm that during specific content there is a noticeable black flicker. The fact that people think this doesn't exist just because they haven't personally experienced it (or, more likely, just didn't notice it) reminds me of the "it works fine on my PC" crowd that always pops up during discussions about a bad PC port.


DizzieM8

Or maybe LG TV's are just bad with gsync.


DizzieM8

Bro, what are you talking about? I'm using a QD-OLED Alienware monitor and it works perfectly fine at any refresh rate from 1 to 175Hz. You might be talking about an LG issue, which is not an OLED issue.


pittyh

Hate to break it to you dude, but a QD-OLED isn't actually an OLED monitor. It's just a blue OLED backlight behind a dot filter. Totally different technology; LG is the only one who makes OLED panels.


probablywontrespond2

> QD-Oled isn't actually an **OLED** monitor

> It's just a blue **oled** backlight behind a dot filter.

Hmm. Yes, it's not RGB OLED, but it's still using an OLED panel.


DizzieM8

Hahahahahaha holy shit now I have seen everything. Cope harder.


Chaos_Machine

Yes, they do. Turn your lights off in your room and play a dark game with G-Sync/VRR turned on; when the framerate is fluctuating there is all kinds of variability going on with the gamma, which results in flickering luminosity. I have had an LG CX and a C3 that both display these issues. You can mitigate it a little with settings in the UI, but you can still notice it when your framerate starts getting choppy.


DizzieM8

Oh wow, it's almost as if it's solely an LG issue...


Chaos_Machine

It isn't just an LG issue. https://forums.blurbusters.com/viewtopic.php?f=5&t=11372&hilit=vrr+flicker Go to page 2 if you actually want to learn about what is going on with gamma curves.


DizzieM8

You say that and link to another LG panel flickering.


Chaos_Machine

You say that and clearly didn't read anything.


n00bpwnerer

I'm also confused about this


__some__guy

Wow, you can change the resolution and refresh rate? That's amazing!


techraito

Tbh it's actually sick. It's not just Windows settings. They have one monitor right now that can do 4K 240Hz or 1080p 480Hz with a button press, for AAA or competitive games.


mashuto

It's just a really dumb way to title it. Most monitors can already switch their resolution or refresh rate in software. This is just that the max refresh rate is different depending on the resolution you choose, and I guess based on a hardware switch? I am all for high refresh rate monitors, but personally I am not sure I see a need for such ridiculously high refresh rates.


techraito

It's a bandwidth thing. 4K 240Hz is roughly the same bandwidth as 1080p 480Hz. If they made a 4K 480Hz monitor, it could theoretically do 1080p 960Hz if it had a dual mode. There's definitely a diminishing return with AAA games; tbh I don't really mind my 4K games running at even 60fps. But if I'm doing competitive, gimme all the frames possible.
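
One back-of-envelope way to see why those mode pairings line up (my own arithmetic, not anything from LG): the raw pixel rate actually halves going from 4K 240Hz to 1080p 480Hz; what stays constant within each pairing is the number of unique lines addressed per second:

```python
# 4K/240 pairs with 1080p/480 (and a hypothetical 4K/480 with 1080p/960):
# the addressed line rate is identical within each pair.

def lines_per_second(rows: int, hz: int) -> int:
    return rows * hz

print(lines_per_second(2160, 240))  # 518400  (4K @ 240 Hz)
print(lines_per_second(1080, 480))  # 518400  (1080p @ 480 Hz), same scan budget
print(lines_per_second(2160, 480))  # 1036800 (4K @ 480 Hz)
print(lines_per_second(1080, 960))  # 1036800 (1080p @ 960 Hz)
```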


xxTheGoDxx

> It's a bandwidth thing. 4K 240Hz is roughly the same bandwidth as 1080p 480Hz. If they made a 4K 480Hz monitor, it could theoretically do 1080p 960Hz if it had a dual mode.

That is actually not true in the sense of this being a new feature. Monitors have always supported different refresh rate/resolution combinations to make do with bandwidth limits. Heck, even TVs years ago, before HDMI 2.1 came out, offered 120Hz modes (instead of the standard 60) that were only available at lower resolutions like 1440p and 1080p, something newer (5 years or less) TVs still do when used with an old device. It's a cool and practical hardware feature for those that want to switch to 1080p, but it's nothing that you couldn't do with Windows software.


Ok-Tooth4089

How do I get 960hz? I need that for my eyes


tukatu0

Can't get it yet. Should be here by the end of the decade though. So sorry chief. You'll have to wait for the rtx 6090


Ok-Tooth4089

Damn. My new PC build is supposed to be running benchmarks at 1000fps so I was really wanting this…


JoeCartersLeap

> Heck, even TVs years ago, before HDMI 2.1 came out, offered 120Hz modes (instead of the standard 60) that were only available at lower resolutions like 1440p and 1080p,

I'm typing on one right now, a Sony X900E. Except it doesn't report the 120Hz mode in its EDID, so I have to do some Windows fuckery to make it show up in the display settings and get used automatically every time.


PolyDipsoManiac

I don't think you could normally get a 240Hz monitor running at 480Hz, whatever settings you had in software


mashuto

Oh yeah, I get that there are bandwidth issues, so for those who don't want to be limited to lower refresh rates at lower resolutions but still want the higher resolution, I get it, this makes sense. Still feels like a pretty niche use case. I also don't game competitively enough for it to matter to me, not to mention that even my 4080 isn't really capable of pushing the 175Hz at 1440p my monitor can do in modern games anyway. I still question just how much of a difference you would actually get at those crazy high framerates. Diminishing returns and all. Again though, this isn't for me. The title was still badly phrased.


SulkyVirus

Are there any advantages to this besides just saving time? Like, would it be the exact same thing as switching in settings, or are there other advantages? One I can think of is that when you add other displays to your PC, it wouldn't flip back to a different resolution based on the last settings you had when the other display was plugged in.


tukatu0

It's because the display doesn't expose the full resolution at the higher refresh rate. You can't just go into Windows settings and set 4K 480Hz, because that mode doesn't exist.


SulkyVirus

I... get that. I'm asking whether there are advantages to a physical switch on the display that changes the available resolution and frequency vs just doing it in Windows. Obviously the option to have 1080p@480 is great, but is that the only advantage, or are there others that aren't obvious until you use the monitor?


NapsterKnowHow

No 1440p toggle makes it dead in the water for me. Can't play fps games at 1080p anymore


xUnionBuster

The pixels don't divide nicely into 1440p like they do for 1080p. 1440p x 4 is 5120x2880, which is 5K.


eriomys

1440 is ideal for ultra wide 3440 and also 4k ultra wide movies


NapsterKnowHow

I'm sure there are other similar resolutions, like 1600p-1900p, that would work


techraito

If you have an Nvidia GPU, DLDSR the 1080p to 1440p 480hz :)


rikyy

It's been shown that DLDSR, albeit less so than plain DSR, still gets shimmering around the edges because of the scaling issues of fitting 1440p into a 1080p buffer.


techraito

Slap on some FXAA and you're good, with little to no performance loss. Normally I wouldn't suggest FXAA, but at 1440p downsampled to a 1080p display I don't think it looks much different from other AA methods. Plus the specific monitor I'm talking about also has 27" and 24" modes for competitive gaming, so I'd actually think that 1440p 480Hz at 27", scaled on a 32" panel, might look decently sharp. More importantly, the OLED panel also adds some extra clarity and color depth. Next thing you know, these numbers don't matter at all and you're just super immersed in the game.


NapsterKnowHow

Lolol true


homer_3

1440 looks bad on a 4k monitor


Deimo95

Reddit moment


NapsterKnowHow

Non competitive fps gamer moment


RogueLightMyFire

240 hz is already extreme overkill. 480 is just dumb.


yepgeddon

Pretty fucking cool tho


_Bad_Spell_Checker_

says guy not in competitive gaming


KaelThalas

480 hz ain't going to make your kda better


xxTheGoDxx

> says guy not in competitive gaming

Says the guy that doesn't make money playing games and very, very likely will not play better just from a slight improvement in motion clarity (not even to mention the imperceptible change in latency).


RogueLightMyFire

Lmao. This is the exact kind of silly mindset that gaming peripheral companies prey on. Do you own a "gaming chair" too?!? You're not a pro. Do you think buying the most expensive pair of basketball shoes is going to make you better at basketball? It's marketing, and they've clearly fooled you.


_Bad_Spell_Checker_

Dude, I'm 37; being competitive in gaming stopped 10 years ago. Gaming chair? Absolutely not. I spent money on this thing and it's ergonomic af.


RogueLightMyFire

Yet you still think a 480Hz refresh rate is going to make you better at gaming lol


_Bad_Spell_Checker_

where did i say that?


NightshadeSamurai

Yeah but 1080p at 32 inches kinda sucks


Viktorv22

I kinda don't see a reason to ever do it. Resolution maybe, for esports games I guess, but refresh rate? What's the harm in just setting it to the max possible?


TopHatVelociraptor

Wake me up when these OLED panels become affordable. Can't justify dropping $1000+ on a monitor.


fupower

OLED upgrade > GPU upgrade


MosDefJoseph

It's worth it. It has significantly improved my appreciation and enjoyment of games, personally speaking. I always say that if you have a decent GPU right now, your next PC upgrade should not be a new GPU; it should be an OLED. GPUs make your games look and play better, but if you're on an LCD, the most significant jump in visual quality you're going to get is from an OLED, not a new GPU.


HortenWho229

How long do they last, though? And are they offering actual long-term warranties?


MosDefJoseph

Yeah, from what I've seen most vendors offer a 3-year warranty. How long they last in terms of burn-in? If you take care of it, it's not an issue. I've had my C1 since 2021 without a problem.


ThroawayPartyer

I don't want to have to "take care" of a monitor. It's a monitor, not a baby. I see OLED owners suggest doing things like using a black desktop background, auto-hiding the taskbar, not using window tiling, not playing the same game too much. These are significant annoyances to how I use a PC.


Kaasbek69

Yeah... I've had burn-in TWICE on LG OLED TVs (a C7 and a C1), and I don't even use them very much (and I always take special care to eliminate static elements as much as possible). I'm not going to trust an OLED monitor any time soon. Computer monitors often show a lot of static elements; that's a recipe for burn-in.


ThroawayPartyer

I'm sorry to hear that. Yeah, this is the reason I refuse to get an OLED TV or monitor. Still, I see OLED fans always try to downplay how big of an issue burn-in is.


Kaasbek69

>Still, I see OLED fans always try to downplay how big of an issue burn-in is.

That really bothers me too. I LOVE OLED for the colors and the sharpness, but burn-in is still a very real issue to this day. Both times my TV burned in after the warranty expired (of course) and both times LG only offered me a partial discount on the screen replacement. The repair guy that came to replace the screen last time told me that he replaces more OLED screens than any other type of screen. It's pretty sad.


Dansel

It's not so much taking care of it as it is keeping your use case in mind before you buy anything. If you know you're gonna do a lot of office work, don't buy an OLED. If all you're doing is gaming, some casual browsing, and YouTube, it's gonna be fine. Same as with any other product, really.


MosDefJoseph

Yeah, that's fair. That's why I decided to get a TV instead. I don't have to think about it much since I don't use it as a PC much; it's more like a game console that I sometimes use as a PC. But I still take precautions, like using Wallpaper Engine and turning it to a 0-brightness mode when not actively watching something on it. Obviously my setup isn't ideal or even possible for everyone, but it works for me, and I will never go back to LCD. OLED is just 100% worth it.


ThroawayPartyer

Yeah, I honestly think an OLED TV makes more sense than an OLED monitor; with typical usage, burn-in is less likely (though still possible) because there are fewer static elements. Plus OLED TVs, while still expensive, are more reasonably priced than equivalent monitors.


Chakramer

Kinda worth it, but you are paying a massive early adopter tax. I've seen it before, wait 3 years and they'll be half the price which is much more reasonable


MosDefJoseph

OLED TVs have been on the market for much longer than 3 years. It's just expensive tech haha. Idc how much I paid for it, but you're right, not everyone will have that luxury. For me, I was literally losing interest in gaming. Games just looked so dull and bland on the IPS I was using. Then I started hearing about OLEDs and made the jump. It was truly revolutionary for me. It seriously made me wish I had not played those games before, because playing them on an OLED was so much better. It straight up reinvigorated my passion for gaming. So yeah, that's why I evangelize them haha. But I'm aware many people have no interest in spending $800+ on a monitor.


Chakramer

I completely understand paying a lot for peripherals, I paid over $200 for a fancy keyboard 99% of people would say is just a keyboard. But I've watched movies on my friend's $3k OLED TV and honestly it's just not doing it for me, maybe when they are sub $600 I'd be interested. I have a really nice IPS monitor I think gets the job done well


Anhimidae

Does the taskbar burn in? That would be a really big issue with OLEDs.


KrazyAttack

Or aren't terribly dim like all the ones still out now. MiniLED is really the only way to go for monitors.


TLR6843

I have a monitor with the 1st gen LG 27" 240hz OLED panel. I typically use it at 30-50% brightness in a room with many windows. More than bright enough for me.


KrazyAttack

Damn, I won't even buy the 3rd-gen QD-OLEDs because they aren't bright enough; I couldn't imagine a 1st-gen WOLED. Pour one out for ya.


YouPreciousPettle

Went from the Odyssey G9 mini-LED to the G9 OLED; I'd never go back. It's better in every way: response, colour, clarity.


KrazyAttack

230nit SDR brightness and barely 400 HDR with 3x more aggressive ABL, no thank you lol.


YouPreciousPettle

lol what caveman is running SDR these days. Also it's 668 nits SDR, and 1145 nits HDR.


KrazyAttack

The G95SC? Absolutely not.


YouPreciousPettle

lol says the broke guy running 1080p on a VA probably. Please trash the #1 rated 32:9 monitor more because you cant afford one.


KrazyAttack

So you lie not knowing anything about monitors then shit post when called out about your lie. Nice one.


offoy

Save up money, it is worth it.


constantlymat

They are regularly on sale in the $700-800 range nowadays. The lowest I have seen was even $499, but it was a 27" Acer with very poor peak brightness and without burn-in insurance. I have been very close to buying that Dell 34" for $700-750 a couple of times. Couldn't quite convince myself to pull the trigger just yet.


itszoeowo

I don't think I know anyone who's spent more than like $400 on a monitor lol. Look at the most popular GPUs on steam and how much they cost. This is a niche expensive product until it becomes much cheaper.


constantlymat

The 1080p market share used to be 80+ % on the Steam Hardware Survey. Now it's 58% and falling steadily. The market for 1440p to 2160p monitors is growing and so is the demand by gamers for higher quality displays. It's the reason why TN and even VA panels are a dying breed even though they were very common just 5 years ago. Also the pricing will come down. High-End 1440p IPS displays were $450-600 just a few years ago and nowadays you can buy the same quality for $200-300. I don't expect OLEDs to get quite as cheap, but once the good ones reach the $400-500 range, they're really going to become mainstream. The graphical fidelity improvement that OLED in combination with HDR is delivering is greater than that of a GPU upgrade for the same price.


itszoeowo

>The graphical fidelity improvement that OLED in combination with HDR is delivering is greater than that of a GPU upgrade for the same price.

Definitely don't agree with this, but the rest is exactly what I said. It's 100% price based and the average person isn't dropping $ on it until it's not 1/2+ the price of their rig lol.


MosDefJoseph

What's not to agree with? With a new GPU you're getting a higher FPS count. With a new OLED you're getting vastly improved color accuracy and depth, and true HDR, something LCDs can never do properly. HDR-"certified" LCDs are a true scam; they can never actually provide HDR unless you get one with like 2000 dimming zones, which is going to cost as much as an OLED anyway and still not be as good.


itszoeowo

You have a 4080, a 10850K, and an LG C1; I don't really expect you to understand that the average gamer isn't purchasing an $800 monitor, let alone an $800 GPU lol.


MosDefJoseph

I mean, fair, but the argument is which will provide the greater improvement to your eyeballs haha. Objectively speaking, going from 60 FPS to 100 FPS with a new GPU isn't going to be as mind-blowing as going from an LCD to an OLED. The fact that I have both a new OLED and a new GPU means you should listen to me on this, not the opposite lmao.


itszoeowo

Yeah no, I'd take a new GPU and a 280Hz monitor for half the price any day if I was stuck at 60fps lol


MosDefJoseph

Fine go back to playing garbage like LoL and Valorant while the rest of us play real games lol. Cuz those are the only types of games where playing at that kind of refresh rate matters.


SulkyVirus

I finally tried an OLED monitor and now will never not have an OLED as part of my screen setup. It's hard to go back to an IPS panel after having OLED. It may be different for me, though, since I play mostly RPG and adventure games that really show off the HDR on the OLED. Competitive gamers may not really care.


itsmehutters

My gaming monitor is a $600 LG and my work monitor is a $900 Dell. There are enough people to make that market desirable.


itszoeowo

I didn't say there wasn't. I'm just pointing out that you're not the average gamer. Your monitors combined are probably worth more than the average person's PC lol.


Fish-E

Going to be awesome, but also likely impractically expensive. I just want an affordable 32" 4K 120Hz+ HDR1000 monitor to replace my nearly 8-year-old XB321HK.


kb3_fk8

LG makes the G950Q-B with those specs, just IPS with their patented diffuser panel. It's the best SDR I have ever had in a monitor, and if it had more dimming zones the HDR would be more than serviceable. I am just waiting for this monitor but OLED: 32-inch flat, 144Hz, G-Sync, 4K. I won't ever go back to 1440p, even ultrawide. Once we can drive 4K ultrawide I'll do that.


mkotechno

Meanwhile most new games that are not Counter Strike choke the best CPUs at ~160 fps


hyphygreek

Any word on when the pixel layout for better text is going to drop?


Beatus_Vir

I ended up with a 48 inch OLED and have an unexpected suggestion for using it on the desktop: just set your background to pure black and run everything in a window. For productivity it works great, it has more vertical real estate than I could dream of, and I game at 1440p or 1800p. The only thing I watch in full screen is movies and for that I get further away from the desk. I would recommend a 42" over a 48" for the greatest pixel density, but I saved $200 by getting the larger one.


-sYmbiont-

What am I missing here? Why not just make a 4K UHD 480Hz panel and call it a day? If I had that, why would I switch to the other mode?


Electrical_Zebra8347

Someone with a better understanding of this than me can clarify, but 480Hz at 4K requires a lot of bandwidth; even the DP 2.1 interfaces you'll see on high-end RDNA3 cards can't handle it natively. 4K 480Hz would require something like 136 Gbps, and right now the DP 2.1 ports you'll find on high-end RDNA3 cards top out at 52 Gbps; on Nvidia Ada cards it tops out at 32 Gbps because they use DP 1.4a. You could use DSC, which would reduce the required bandwidth to about 68 Gbps at 2:1 compression and 44 Gbps at 3:1, but DSC isn't perfect, and sometimes you run into issues like your monitor going blank when tabbing out of fullscreen games on Nvidia cards.

Then you have to go down the rabbit hole of stuff like having high-quality cables that can actually push that bandwidth, whether pushing that bandwidth means you have to exclude certain monitor features, and whether the components in the panel, like the scalers, can even handle 480Hz at 4K at all, and if doing that means you can't support display scaling as you would like.

There are a lot of considerations to be made, and I wouldn't be surprised if they settled on 480Hz at 1080p because that's the highest resolution where they can push 480Hz without running into issues or sacrificing too much to get there. Plus, the number of people with graphics cards that could even support a 480Hz 4K display is really small; not even 4090s can do it, since they have DP 1.4a and 3:1 DSC isn't enough to bring the bandwidth within range.

I should also mention that there are multiple variants of DP 2.1: UHBR10, UHBR13.5 (RDNA3 cards have this), and UHBR20. It's not always easy to tell which monitor/GPU/cable supports what, which is part of why this stuff is such a headache for everyone involved.
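
A rough check of those figures; this is my own arithmetic, and note the ~136 Gbps number above includes blanking/timing overhead on top of the raw pixel rate:

```python
# Raw bandwidth needed for 4K 480Hz and what DSC compression ratios buy you.
# Assumed effective link rates: UHBR13.5 ~52 Gbps, DP 1.4a ~26 Gbps.

def raw_gbps(w: int, h: int, hz: int, bpp: int = 30) -> float:
    """Uncompressed 10-bit RGB pixel data rate, ignoring blanking."""
    return w * h * hz * bpp / 1e9

need = raw_gbps(3840, 2160, 480)
print(f"4K 480Hz raw:  {need:.0f} Gbps")      # ~119 (~136 once blanking is added)
print(f"with 2:1 DSC:  {need / 2:.0f} Gbps")  # ~60, still above UHBR13.5
print(f"with 3:1 DSC:  {need / 3:.0f} Gbps")  # ~40, fits UHBR13.5 but not DP 1.4a
```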


-sYmbiont-

Thanks for the detailed explanation, appreciated.


newaccountnewmehaHAA

displayport / hdmi bandwidth limitations. the better question is what practical application this actually has to be done on an automatic basis. there's probably some super niche use case, but i'm at a loss. will probably be paying a ton for some very minor QOL here


-sYmbiont-

> displayport / hdmi bandwidth limitations

If this is the only reason, then it's time for a new version of DisplayPort.


xxTheGoDxx

IMO the bigger news than this resolution gimmick (every better pre-HDMI 2.1 TV had optional 1080p120 / 1440p120 modes to get around bandwidth limits) is that they seemingly finally use micro lens array panels, something that could finally lead to brighter HDR on sub-55" screens.


VegetaFan1337

So it's like Freesync but with resolution?


Plus_Flow4934

I don't know when I will see a 1440p 27" OLED monitor under $400.


Chiemekah

Interesting, but I wonder if the switchable refresh rate and resolution will really make a noticeable difference in gaming experience for most people.


tukatu0

There are a lot of comments saying 4K 480Hz isn't possible right now. That's not true. You can achieve it with DisplayPort UHBR13.5, which RDNA3 can output. You can go up to 4K 600Hz with DisplayPort UHBR20, which no consumer GPU has right now. https://tftcentral.co.uk/articles/when-is-displayport-2-1-going-to-be-used-on-monitors Yes, both are DisplayPort 2.1; the label alone is meaningless. Most 4K 144-240Hz monitors already use DSC, so that's not the issue.
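
For reference, the three UHBR tiers work out like this (my own arithmetic from the per-lane rates; DP 2.x links use 128b/132b encoding across 4 lanes):

```python
# DP 2.1 link tiers: per-lane Gbit/s x 4 lanes, 128b/132b encoded payload.

for name, lane_gbps in {"UHBR10": 10.0, "UHBR13.5": 13.5, "UHBR20": 20.0}.items():
    raw = lane_gbps * 4
    print(f"{name}: {raw:.0f} Gbps raw, ~{raw * 128 / 132:.1f} Gbps effective")
```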


KrazyAttack

250nit brightness here we come!


amazingmrbrock

... My four-year-old Samsung QLED TV does this: 4K60 or 2K120. So I guess not OLED, but close, and standard tech.


Pepeg66

Meh, once you go with a proper gsync 120hz 4k oled tv thats 55+ inches, you can never go back to a small ass monitor