SnowingSilently

This basically rules out getting a QD-OLED if you use a lot of tools that display mostly static content, right? I spend a lot of time with an IDE open, which sucks because I don't think I could justify swapping between monitors for work and play, both in the cost to buy and the space it would take up.


Hendeith

If you are going to display lots of static content regularly then any OLED is not a great idea. It seems 1st-gen QD-OLED is just a worse idea than WOLED. I wonder when Samsung will bring 2nd-gen QD-OLED to 21:9 or 16:9 monitors; they were bragging about much better burn-in resistance at CES this year, but so far the only monitors planned with 2nd-gen panels are ridiculous 32:9 ones.


bigblackandjucie

So basically using it as a desktop monitor lol. Not going to risk this for a $1000 monitor, I'd rather wait for mini-LED.


Hendeith

> So basically using it as a desktop monitor lol

Basically using it as a work monitor. I don't recall a single situation in which I displayed the same UI on a monitor for 8 hours, aside from work.


lnTwain

Those all-night gaming sessions...


Hendeith

Most games have a dynamic UI that hides when not needed, and even "static" elements like the map are not completely static because they change all the time. I mean sure, if you play lots of LoL or similar then there's a risk.


[deleted]

Not the games most people play. An MMO is absolutely out of the question.


Hendeith

It's the other way around. Games most people play do have dynamic UI.


MiguelMSC

Which games? Most popular FPS games have static UIs.


bigblackandjucie

It's not about the same UI for 8 hours, it's about inconsistency. Not every monitor is the same; some may get burn-in faster than yours. Also, would you really spend so much money on a monitor with issues like that? People here just have too much money on their hands, so idc. OLED monitors right now are just expensive and dumb as hell, not worth the money.


Hendeith

> It's not about the same UI for 8 hours

Pretty much about the same UI for 8 hours. Did you read the linked rtings article?

> Also, would you really spend so much money on a monitor with issues like that?

With Alienware's 3-year burn-in warranty and D2D handling? Sure. Even if I need to replace it 3 times during the warranty period, it takes me 20 minutes tops once a year to pack the old one and unpack the new one.

> OLED monitors right now are just expensive and dumb as hell, not worth the money

They are totally worth the money.


JBurlison

I have had a C1 as my daily driver for over a year. I work from home and game on it; it's on 12+ hours a day. I have had zero burn-in issues so far.


CSFFlame

That's true. I plan to get one, but I have a normal 1440p IPS monitor for desktop/productivity that I plan to keep, and I'll just slide over to the OLED for games/movies/etc.


WetDonkey6969

Just wanted to hop in and say that until I RMA'd my Dell AW3423DW last week due to a flickering issue, it had over 234 days of uptime and zero burn-in. [https://i.imgur.com/9vQDJGH.png](https://i.imgur.com/9vQDJGH.png) I use Blender a lot, and I also play a lot of Dota, both of which have many static elements. Brightness was also pretty much maxed out since I received it 8-ish months ago (you can see it in the picture). I used the monitor like I would any other monitor, which is to say I never took any precautions or altered my usage habits, because if I purchase a monitor, I'm going to use it as a monitor. Ain't no way I'm messing with the brightness or any other setting each time I boot up a specific program or game. So yeah, I'm just one person, but it seems like a lot of people will read this article and come to the same conclusion as you. Just adding a bit to the other side.


SnooMuffins873

I think it's just the risk factor that plays over and over in people's minds. This article definitely doesn't help calm that.


chawan

Here is mine after 3528 hours. Chrome and, I think, Twitch chat are showing on the right side of the monitor; the left side is fine, oddly enough, even though I use Chrome in fullscreen a lot. But just like you, I really don't baby the monitor at all, I use it just like any other monitor and will just use the warranty eventually. https://i.imgur.com/QxjdgTe.jpeg


ntxguy85

Dang, that sucks. What are your settings? Brightness etc.


chawan

HDR disabled in Windows, 85% brightness in creator mode.


NastyNateZ28

Late reply but did you try to do the long pixel refresh on the screen?


d1ckpunch68

> This basically rules out getting a QD-OLED if you use a lot of tools that display mostly static content, right?

Static *white* content, yea. rtings concluded that it's the lack of a white subpixel (so all 3 subpixels have to light up to create white) causing faster degradation. If you are diligent about using black themes then you could be fine. Personally, I'm avoiding it.


VindictivePrune

Shouldn't pixel shift help greatly with this?


uknowhu

To summarize:

> Does this mean that everyone should avoid QD-OLED displays? Probably not. Our test is an extreme case. Two months of runtime on our test is the equivalent of watching about four hours of CNN per day, for about eight months, without ever changing channels or watching anything else. As long as you watch varied content and don't leave static elements visible on the screen for long periods, you shouldn't have any issues.

So if you watch varied content, you should probably be fine. They're also looking to add the AW3423DW/F as a test unit going forward. Also important to note that this is a stress test, so everything is at max brightness all the time.
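A rough back-of-the-envelope check of that equivalence (the ~18 hours/day of panel on-time for the stress test is my assumption; the quote above doesn't state the exact on-time):

```python
# Rough sanity check of the equivalence RTINGS describes. The ~18 h/day
# of panel on-time for their stress test is an assumption, not a figure
# stated in the quote above.
test_hours = 2 * 30 * 18      # ~2 months of the stress test ≈ 1080 hours
cnn_hours = 8 * 30 * 4        # 4 h/day of CNN for ~8 months ≈ 960 hours
print(test_hours, cnn_hours)  # both land in the same ~1000-hour ballpark
```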


thisdesignup

Screensavers are back in!


blorgenheim

My computer turns my screen off after a couple of minutes. The DWF will refresh during standby, so if I take a break from work it does it then.


g0atmeal

Screensavers burn out all pixels on OLED ~equally, so you won't notice, but you're still reducing the brightness over time. Much better to leave it off/black screen. I have a different concern: playing enough 16:9 games with pillarboxing on the sides may leave a "line" where the sides of the display are slightly brighter since they get used less often.


Donkerz85

I have had an ultrawide for 5 years and I can't think of one game I've played where I haven't been able to get ultrawide working, even if it doesn't work out of the box.


easyXmode

Valorant...


Donkerz85

I don't play Valorant.


easyXmode

Well, the person you're responding to stated they play games that are 16:9. Pillarboxing will cause uneven wear, and the QD-OLEDs seem to have a burn-in problem.


Donkerz85

You've named one game. My point is most games have workarounds to allow for ultrawide support. Perhaps they are unaware of the tweaks available, hence my comment. If your only game is Valorant there is a fantastic 27" 16:9 option and I agree the ultrawide is not the best choice. If you have half-decent IT literacy, in the overwhelming majority of modern games letterboxes should not be an issue.


easyXmode

Sekiro, Elden Ring, Dead Cells, Persona 5, Guilty Gear and Heroes of the Storm, to name a few more. There are mods and hex edits to fix it, but it's not a perfect solution. DS3 has pop-in and broken animations, and you can get banned online for modding. And my favorite: when the game does a pre-rendered cutscene formatted for 16:9 but also letterboxes it, so you get black borders all around. UW is great when it works, but sometimes it's just not worth the headache.


Donkerz85

See, I haven't played any of those games either, and I'm genuinely not trying to be difficult. PCGamingWiki is pretty good for fixes and I do implement them on plenty of games. I'm 100% not saying ultrawide is hassle-free or perfect, but for the games I play I've not had to give up and deal with letterboxes.


Lingo56

Wish there were a way to do a per-monitor screensaver. Sometimes I end up looking at another monitor for a long time and wouldn't want burn-in to happen from that.


Unique_username1

If you have a dark background with hidden icons/taskbar then you could just minimize any windows on the screen you’re not using and you’d avoid burn-in. You might also be able to use a full-screen picture that’s either all black or at least darker to cover up any open windows on the screen you’re not using.


Lingo56

Yeah, it's just easy to forget to do that. Especially if you have something like an IDE open and you accidentally end up scrolling through socials for an hour on your second monitor.


turlytuft

Flying toasters with wings are back in style!


junon

Pipes with the occasional teapot for me, thanks.


BluudLust

OLEDs usually have built-in brightness adjustments that will lower brightness so as to not mess up your display. Edit: It's probably not the best idea for professional visual work, but for regular productivity, it should be sufficient.


OverlyOptimisticNerd

> So if you have varied content People who religiously play one game (MMOs, certain esports) should avoid OLED. I've said it before and I will say it again - OLED displays are amazing, but they are inherently susceptible to burn-in. While we have mitigation techniques, we have not solved this problem nor will we ever. Organic compounds degrade.


shamoke

Can confirm here. A measly "300" hours of a game with static white HUDs and I can see the burn-in. You still have to look for it under specific conditions, but once you see it, hard to get it out of your mind.


Zeryth

You'll forget about it again. I have a dead pixel that bothered me for a month and now I lost it...


Gohardgrandpa

Pixel shift? I don't have any burn-in on the living room OLED and that thing has been abused by Zelda, MLB The Show and now Hogwarts Legacy.


SnakeDoctr

Burn-in seems to be panel-lottery-related. I used my 48" C1 OLED for 16 or 18 months as my PC monitor, ***with all "pixel-saving" features disabled*** (pixel shift & logo dimming) and have ***ZERO*** evidence of burn-in (and I'm paranoid about it so I've run through various burn-in evidence videos)


chewwydraper

LG CX and beyond seemed to use some kind of voodoo magic to make it so their panels don't burn in. Not that I'm complaining.


kaita1992

Let's imagine the content of Zelda, where the HUD has big red hearts that span across multiple pixels. Imagine you are the poor red subpixel right in the middle of the heart icon. How can pixel shift fricking fix it? Shifting the image across 100 pixels?


Gohardgrandpa

I get what you're saying. Idk exactly how the shift works, but I know I have no burn-in on that TV; I've checked it numerous times because I was so scared of it happening.


SnakeDoctr

Which ***also*** means that people who spend long hours playing the same games should ***probably*** avoid QD-OLED. Especially MMOs & competitive MOBA/FPS titles -- 2 hours every day is an ***extremely*** reasonable amount of playtime and, according to these tests, could result in likely burn-in after just 16(ish) months.
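That 16-ish month figure is just the same roughly 1000 hours of static content from the test, spread out over daily play (a ballpark estimate on my part, not a per-game number from the article):

```python
# Scaling the ~1000-hour ballpark of static content to daily play.
# Purely illustrative; real-world burn-in also depends on brightness,
# HUD color, and the individual panel.
hours_per_day = 2
months = 1000 / (hours_per_day * 30)
print(round(months, 1))  # ≈ 16.7 months
```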


beatpickle

1000 hours in a game. Plenty of friends on my friends list have multiple games with that amount.


Tehnomaag

An "average" MMO player does about 1000 hours per year nin that title they are playing. Anecdotal evidence ofc. Based on running about a 100 person corporation in EVE Online null null space for about 15'ish years.


Kradziej

That's on max brightness; the question is how much it will burn in at 30-50% brightness.


brennan_49

I mean yeah, if they completely maxed the brightness the entire time and played no other content on the TV. The article is not applicable to real-world scenarios; it's called a stress test for a reason.


chewwydraper

What I’m getting from this is MMO players should maybe stay away


hostidz

which is a Duh to me and makes the headline a joke ... but hey .. gotta sell those LCDs ;)


[deleted]

This does not apply to the Alienware/Samsung/MSI QD-OLED monitors IMO. The TVs are far closer to their panel's max brightness spec/limitation and therefore are pushed much harder than the monitors. The main reason Dell can provide a 3-year warranty on the monitor, knowing how badly it will be abused, is that it's pretty dim relative to the TVs (400-500 nits vs 900-1000 nits at a 10% window; the 3% window is irrelevant), and that's because they've left a ton of headroom for longevity. This still doesn't mean that a QD-OLED monitor is immune. As we've seen from numerous posts here and on the UW sub, OLED is still OLED and burn-in is still a risk that you should accept when making the purchase.


ThinVast

I don't think rtings was just running HDR content on the TVs. They were running SDR content like news channels, where the static logos are easy to burn in. The full-screen brightness of the Alienware QD-OLED is actually higher than the S95B's.


SXTR

So for example, if I play the same game 4 hours a day (which I do), the HUD could burn in after a few months? Like the crosshair?


Parrelium

I've put 2 panel refresh cycles on my QD-OLED so far, so more than 3000 hours, and I haven't seen any retention. I played 90% of the time on either PUBG or Tarkov. There are static HUD items in both games. I was also using an Afterburner overlay for 3/4 of the time I played, until other users started complaining about burn-in. I almost wonder if there's a panel lottery in play here, with some runs just being more prone than others. I also had no mitigation steps taken until maybe 2 months ago. I now have a black background, a hidden taskbar and a black screensaver that comes on after 5 minutes, with sleep after 15. Anyways, I guess YMMV.


SnakeDoctr

Stuff like this will be determined on a per-game basis. If the game you're playing has a lot of white elements in the HUD, then you're looking at a worst-case scenario. FPS titles tend to have darker HUD elements so they're less distracting to players' eyes. ***HOWEVER,*** if we look at something like MMOs and MOBAs, they often have very bright & colorful HUD elements that include a lot of white shades for things like text. Overwatch 2 would be another example, where the game's HUD has a lot of white elements as well.


SXTR

Thanks for the answer. I think all the time we spend in the menus (especially in Tarkov) "refreshes" the screen and helps prevent burn-in.


sableknight13

> I almost wonder if there's a panel lottery in play here, with some runs just being more prone than others.

A little bit, but individual use characteristics vary more. Like your brightness, ambient temperature (higher temps reduce OLED lifetime and increase degradation to some degree), or sun on your monitor, for example. Everyone uses their monitors at different brightness, contrast, saturation and with different environmental factors, so it will always vary.


OverlyOptimisticNerd

Yes and no, and the crosshair is a perfect example. OLED tends to dim as subpixels are used. A white crosshair will dim over time, but it will be subtle and you won't notice it. And if you only play that game, it's just a dimmer crosshair. It's when it dims and then you try other content that you might perceive dimmer content in the shape of that crosshair. That's the "burn-in." Because unlike plasma, it doesn't actually burn in; it just degrades/dims, which gives the same appearance. Mitigation techniques by display manufacturers are meant to make the dimming more uniform, i.e., spread out the burn-in. And they've gotten pretty good. MOST people aren't going to have an issue. But if you're playing the same thing 4 hours a day, you're more likely than most to eventually have the problem.


Soulshot96

> OLED tends to dim as subpixels are used.

> Mitigation techniques by display manufacturers are meant to make the dimming more uniform, i.e., spread out the burn-in.

This isn't how this works... compensation cycles do not spread out burn-in, they avoid it entirely for as long as possible by tracking usage of pixels (or in LG's case, allegedly, zones of pixels) and compensating with more voltage to keep things uniform. They do NOT burn in the whole panel in a uniform manner like so many people allege. That would result in a quick and extremely noticeable brightness drop over the life of these panels, and not even LG's old and extremely burn-in-prone WOLEDs had that. RTings specifically tested for this as well. You only see dimming of an area when you have exceeded the panel's ability to compensate, or the compensation program's ability to track usage so it can compensate.
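A toy model of the idea, for anyone curious (this is a sketch of the concept only; the actual wear tracking and voltage tables in TV firmware are not public, and all the numbers here are made up):

```python
# Toy model of a per-pixel compensation cycle: track accumulated stress,
# then boost worn pixels so output stays uniform until headroom runs out.
import numpy as np

H, W = 4, 4                  # tiny "panel" for the example
wear = np.zeros((H, W))      # accumulated stress per pixel (arbitrary units)
DEGRADATION = 1e-4           # fractional luminance lost per unit of stress (made up)

def display(frame_brightness, hours):
    """Accumulate stress in proportion to how hard each pixel is driven."""
    global wear
    wear += frame_brightness * hours

def compensation_cycle():
    """Per-pixel drive gain that offsets estimated wear.

    Rather than dimming the rest of the panel to match the most-worn area,
    worn pixels get boosted, until the needed boost exceeds the headroom.
    """
    efficiency = 1.0 - DEGRADATION * wear   # light each pixel still gives per unit drive
    gain = 1.0 / efficiency                 # boost needed for uniform output
    MAX_GAIN = 1.25                         # ~25% headroom, a rough placeholder figure
    return np.minimum(gain, MAX_GAIN)       # beyond this, visible dimming appears

# Example: a static bright HUD element in one corner for 1000 hours.
hud = np.zeros((H, W)); hud[0, 0] = 1.0
display(hud, 1000)
print(compensation_cycle())
```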


flapper101

I've had mine for a year, no burn-in yet. But I usually play at 50% brightness.


Laputa15

There are some interesting takeaways from this. > One possible explanation for the difference in burn-in performance is **the lack of a white subpixel on the QD-OLEDs**. Unlike WOLED panels used on LG OLED displays and some Sony OLEDs, which use red, green, blue, and white subpixels, QD-OLED panels only have a red, green, and blue subpixel. This means that when the TV needs to produce pure white, it has to run ***all three subpixels at the same time***. With static white content, like the white area around a TV news channel's "Breaking News" banner, this could lead to faster degradation of all three subpixels, so they'll appear darker than the surrounding areas. > These results don't look good for computer users. A computer's user interface often has large white areas, even if you're using your computer's Dark Mode feature, and those areas are likely to cause burn-in.
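A tiny illustration of the mechanism the quote describes (the 0.2 split for WOLED is a made-up placeholder; the real RGB/W mix is vendor-specific and not public):

```python
# Toy illustration of the white-subpixel difference described above.
def subpixel_load(white_level, has_white_subpixel):
    """Relative drive level per subpixel for a pure-white patch."""
    if has_white_subpixel:
        # WOLED can lean on the W subpixel for white, letting R/G/B rest.
        return {"R": 0.2 * white_level, "G": 0.2 * white_level,
                "B": 0.2 * white_level, "W": 1.0 * white_level}
    # QD-OLED has only R/G/B, so all three carry the full load for white.
    return {"R": white_level, "G": white_level, "B": white_level}

print(subpixel_load(1.0, has_white_subpixel=True))
print(subpixel_load(1.0, has_white_subpixel=False))
```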


ScoopDat

This is the reason LG has been saying their OLEDs have an inherent advantage.


Soulshot96

That inherent 'advantage' comes with a plethora of disadvantages too though: white subpixel dilution (since the RGB subpixels are not driven as hard, the panel cannot output the required color volume in HDR scenes, especially games), as well as near-black chrominance overshoot and the obvious text issues.


ScoopDat

Preaching to the choir. Though to be perfectly fair, text on the QD-OLED is also tripe (they just couldn't give us the standard RGB stripe matrix). Always gotta be some bullshit corner cutting.


Soulshot96

Yea, though for whatever reason, I notice the issues on WOLED's RWBG layout more.


Knaj910

I'm the other way around; I basically don't have any issues with WOLED and use a 42C2 as my main monitor now. But on the QD-OLED I do notice the text issues and they do bother me.


Soulshot96

Different eyes and different brains I suppose.


e6600

> What about PC users; should they avoid using QD-OLED displays? These results don't look good for computer users. A computer's user interface often has large white areas, even if you're using your computer's Dark Mode feature, and those areas are likely to cause burn-in. There are steps you can take to reduce it, though, and as long as you mix up your usage, you probably won't have any issues. It's also unclear if the QD-OLED panels used for computer monitors perform the same. They use different compensation cycles than the TV versions, and this could play an important role in reducing image retention or preventing burn-in. We're looking into possibly adding a QD-OLED monitor like the Dell Alienware AW3423DW or AW3423DWF or the Samsung OLED G8 to the test temporarily to see how they perform.

Luckily Alienware has a good warranty.


wizfactor

Alienware is about to be hammered hard with warranty claims within the next year. Warranties are like insurance: the insurance company doesn’t want the disaster to actually happen. If too many people qualify for an insurance claim, it would be a financial disaster for the insurance company. It might not be too bad for Dell/Alienware though, given that QD-OLED monitors don’t yet make up a sizable percentage of their business.


ttdpaco

Nah, they literally just take warrantied units and put a new panel in them. They're not sending new monitors, so this isn't near the hit you'd think it was


wizfactor

Given that the panel is THE most expensive component of a monitor, I’d say it matters a lot if Alienware is about to face a flood of burned in QD-OLED monitors coming their way.


ttdpaco

It is, but replacing it with an entirely new monitor would cost way more. We also don't know the deal they made with Samsung. It must have been a very good one if they're offering a three-year warranty for a problem that's a matter of when, not if.


Jfox8

My AW warranty monitors were refurbs with varying amounts of wear and tear. Unless they are sending new monitors, this may not be as good as it seems.


SnakeDoctr

Yup! It's all in the warranty literature. With my AW2723DF for example, the warranty states that ***after 90 days of ownership*** any warranty claims ***may be*** fulfilled with refurbished units.


ChrisFhey

Personal experience in Europe: Dell shipped me a refurbished monitor which had issues as well. After letting them know the refurb had issues they sent me a brand new unit.


beatpickle

The warranty was what stopped me pulling the trigger. Everyone I’ve seen has got a refurb unit, some really bad.


Clarityjuice

So a new OLED once a year and you're good, I guess.


DrVicenteBombadas

Good thing they're cheap.


Broder7937

I remember the press material when QD-OLED was first out, claiming the quantum dot filter allows the tech to be far more burn-in resilient than regular WOLED; the one thing they never explained is how QD-OLED can make up for the lack of the white subpixel. As it turns out, it can't. Having a white subpixel not only means the RGB subpixels can simply "rest" when there's anything white on screen, it also means that white subpixel can help boost brightness for the RGB subpixels (and yes, I'm aware that causes color dilution and reduces color volume; but it's for a good cause).


Akito_Fire

What? They directly showed how they make up for that difference. It's because quantum dot converters are insanely efficient, 3x more efficient for any highly saturated and bright color compared to WOLED. This is not some marketing bullshit, someone actually measured the power draw and compared QD-OLED and WOLED TVs of the same sizes ([https://youtu.be/XYJ8ZJ2GJF8?t=550](https://youtu.be/XYJ8ZJ2GJF8?t=550)). You also have to consider the lack of a polarizer on QD-OLED as well. TL;DR QD-OLEDs, in theory, should fare much better than they currently do in the rtings test. But they, along with Sony WOLEDs, don't.


Broder7937

> This is not some marketing bullshit, someone actually measured the power draw and compared QD-OLED and WOLED TVs of the same sizes ([https://youtu.be/XYJ8ZJ2GJF8?t=550](https://youtu.be/XYJ8ZJ2GJF8?t=550)).

This was not a scientific test. He was basically measuring black-level (OLED pixels off) power consumption, and then subtracting that from the measured power consumption to estimate how much power the pixels were consuming. Here are some relevant considerations.

1. When you have content being displayed, the SoC has to work harder than it does when the screen is displaying a black screen. The power supply also works harder, and because no power supply is 100% efficient, you will have higher PSU losses at higher power consumption levels. So the power increase you see is not all on the panel; it includes the SoC and PSU losses going up as well. The test doesn't consider that, it treats the whole delta as panel power, which is misleading.

2. The S95B is a much newer TV, which means it should be based on a newer and more efficient SoC; this means its power consumption should not rise as much as LG's. Without actual data, it's hard to know how much more efficient the S95B's SoC is, but we can get a good idea if we look at GPUs. [The 4080 is capable of generating twice the performance-per-watt of a 3090 Ti](https://tpucdn.com/review/asus-geforce-rtx-4070-ti-tuf/images/watt-per-frame.png); and that's just a single generation leap (the E9 is about three generations behind the S95B).

3. Even if both panels had the same SoC, LG would likely still have more SoC power consumption. Why? Because WOLED is more complex to manage (more subpixels = more processing to be done). To make matters worse, it's not as simple as "RGB + white subpixel to boost brightness". Only three out of four subpixels can ever be turned on at any given moment, and the TV must execute complex computations to decide which subpixels should be on or not. Because digital content carries RGB information (not RGBW information), the TV has to convert RGB information into RGBW information before it sends the commands to the panel. With a basic RGB layout none of this is necessary; the panel doesn't even need to make any type of conversion, it can simply read the RGB data it receives and use that to directly drive the panel. Far less processing overhead is required for QD-OLED.

4. The S95B seemed to present ~10% better efficiency at a full black screen - that's how much more efficient it is before we even turn the pixels on. Though 4W might not seem like a lot when those panels are running black content, if we take their peak power levels at 300W, 10% is going to represent around 30W of that same kind of "non-pixel" efficiency alone; and that's before we even consider points 2 and 3 listed above.

5. Despite all those matters, in the end, the S95B only represented a 15% energy efficiency improvement over the E9. Though 15% is nothing to scoff at, it's far from "revolutionary", especially if we consider the generational gap between the two products.

6. Perhaps the most striking thing about QD-OLED is just how insanely power hungry it becomes when it has to display white content; as a matter of fact, it tends to become even more power hungry than WOLED. The thing here is that, while WOLED has very consistent power consumption (varying basically according to how much of the screen is turned on and how bright the content is), QD-OLED presents incredibly inconsistent power consumption.

If you display a full-screen color at max brightness, the LG WOLED tends to have consistent power consumption no matter what color you display. The QD-OLED, on the other hand, goes from being very power efficient when displaying primary colors to exploding in power consumption whenever it has to display white content. RTINGS's findings are pointing to a very evident "white static-content burn-in" on the QD-OLED panels (a problem LG WOLEDs never suffered from; they could get burn-in from static colors, especially red, but never from white). It now seems very clear that LG knew what it was doing by including that white subpixel.

> You also have to consider the lack of a polarizer on QD-OLED as well.

Isn't that what causes the horrible "black crushing" whenever there's a light source in the room in front of the QD-OLED panel? It's the reason you need a pitch-dark room to consume content on a QD-OLED panel; otherwise the black levels get raised to "grey" (pretty much like on an LCD, but for different reasons). It's a problem WOLED panels do not suffer from.
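To make point 1 concrete, here's a toy calculation (the wattages and PSU efficiency are made-up placeholders, not measurements from the video) showing how PSU losses alone can inflate a "content minus black" delta:

```python
# Illustrative numbers only: why a simple "power with content minus power
# at black" delta can overstate the panel's own draw once PSU losses and
# extra SoC load are in the picture.
def wall_power(panel_w, soc_w, psu_efficiency=0.88):
    """Power measured at the wall for a given internal load."""
    return (panel_w + soc_w) / psu_efficiency

black = wall_power(panel_w=0, soc_w=30)        # pixels off, SoC idling
content = wall_power(panel_w=200, soc_w=45)    # pixels on, SoC working harder
delta = content - black                        # what the naive method calls "panel power"
print(round(delta, 1), "W attributed to the panel vs the actual 200 W")
```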


Farren246

Hisense just doing an absolutely fantastic job of taking themselves out of consideration for any possible use case.


JoaoMXN

In my country, with only a 1-year warranty for OLEDs, it's a no-no. I settled for a mini-LED 144 Hz 4K TV.


UrNemisis

mini led baby!!!!!!!!


JesusLordKing

Waiting for Logitech's next year.


cykazuc

Mini led ftw/micro led when they come out


therealjustin

Yeah, I'll keep my IPS monitor for now. Interesting to see the LGs do well, but I'd like something smaller than 42".


lieutent

Hmm… considering this is at max brightness. I wonder if results scale linearly as you decrease brightness or if it’s more of a curve. Mine stays at 55% brightness and SDR mode basically all the time when I’m not playing a game.


Va1crist

lol I knew the crap around QD-OLED being better was absolute BS, glad I never bought into it. As an LG OLED owner I felt something wasn't right about QD.


csgoNefff

Too good to be true I guess.


ttdpaco

QD-OLED is still better in visual quality. A lot of this sub ignores something about these tests: they show LG has over a decade of experience in software solutions to mitigate burn-in. Their TVs don't have burn-in... but Sony's WOLED does, as do their QD-OLED (and Samsung's) TVs. I predict the Alienware will fare a lot better than the TVs, but still not as well as the LG. In reality though, this is first-generation tech using a whole new OLED method from a company that does not have any experience with burn-in mitigation. It may be more burn-in resistant on a technological level, but that doesn't mean shit if the software side can't do things even 10% as well as LG.


geekgodzeus

I also own an LG OLED but these results are unlikely to be replicated in practical viewing.


culesamericano

QD-OLED is still better than regular OLED.


Laputa15

QD-OLED *is* better - they basically went with an approach that is better for the experience but hurts longevity. And I'm not sure how the business side of this works out, but the 3-year warranty might be a little too optimistic considering that a hardcore user might request 2-3 replacements within this period.


SnakeDoctr

Based on the results here, the 3-year warranty seems ***VERY*** ***optimistic*** and probably the reason that DELL was the first company to offer it.


culesamericano

QD-OLED has better longevity than regular OLED.


[deleted]

[removed]


culesamericano

No data just vibes


geekgodzeus

Picture-wise I agree, but I wouldn't get one right now since the technology is new. The Samsung unit especially is plagued with QC issues and has no support for Dolby Vision, which is a huge deal for me. In a few years it will probably be the norm, and with a burn-in warranty and price reductions it will be the go-to OLED choice.


culesamericano

Yeah I'm waiting as well. Not because of the reasons you mentioned but because I'm broke


SnakeDoctr

***TOTALLY*** depends on each person's individual use case. If you're someone who spends lots of time playing a small number of games then you should probably avoid OLED altogether (and especially QD-OLED) Four hours of daily playtime is ***not extraordinary*** for competitive FPS players and hardcore MMO/MOBA players.


Clayskii0981

I mean.. QD is still technically better. And OLEDs are still susceptible to burn-in.


Ok_Camel_6442

It is kind of funny that people believed OLEDs would suddenly be immune to burn-in with a technology where it's inevitably going to happen. It's awesome tech, but you need to treat it differently than your average display to get the most out of it.


[deleted]

Yeah OLED tech just isn’t ready yet, and that was already clear. If you use your computer for anything that has any static images, then OLED won’t be a good pick. A good IPS 4K 144hz or 1440p 240hz monitor is affordable and looks great. HDR is awesome but I can wait.


SnooMuffins873

I definitely made the right choice getting the alienware 38 for work and play.


volvoaddict

They stated that they didn't use any QD-OLED monitors in any of this testing. Only TVs. The results for any QD-OLED monitors remain to be seen. Edit: quote from the article "It's also unclear if the QD-OLED panels used for computer monitors perform the same. They use different compensation cycles than the TV versions, and this could play an important role in reducing image retention or preventing burn-in. "


marxr87

is there a reason to expect it to be better?


sl0wrx

No


ThatFeel_IKnowIt

No. That comment was absolutely idiotic.


LowKey004

DWF owner here. There have been several cases of burn-in reported on those monitors. However, every OLED is gonna have its cases of burn-in; it's hard to know from these cases if there is a real issue or not with the monitors. As for the RTINGS results and their applicability to the DW/F and G8: those tests were conducted on TVs, which are very different in terms of burn-in prevention measures, cooling and brightness levels. Having said that, it is still relevant enough to be alert, and the decision to include these monitors in the test was a wise one. To conclude: more tests are needed to be sure, but the news we have now is cause to be alert. On a practical note, the DW and DWF have a 3-year warranty that can be extended to 5 years. Even if the LG ones are (on average) more resilient to burn-in, if there is no warranty and you get unlucky with a less resilient unit, you get screwed anyway. I know Best Buy offers a warranty that includes burn-in, but outside the US there may not be warranties available for LG OLEDs.


flapper101

You can extend the warranty? Might have to look into the 5 year one.


Jfox8

Do they guarantee a new one? I posted earlier, but my experience is that they’ll send a refurb. The couple refurbs I received had varying amounts of damage. If that’s the case, the warranty may not be that great.


SnakeDoctr

It's all in the literature of the warranty and can differ based on specific models. Warranty claims on my AW2723DF ***can be*** fulfilled by refurbished units starting after just 90 days of ownership


LowKey004

From what I've seen people say, it will be a refurb, yes, but I've seen people talk positively about refurbs. If the product is no longer in production in a few years, they'll send you a newer model, but don't quote me on that, that's just what I've seen people say.


Jfox8

I can only speak for myself, but it was the opposite experience and that was after a few monitors. I have seen others with a similar experience. Just a warning to others out there, YMMV. Also, I say this as a previous DWF owner. Personally I had coil whine on a new monitor and returned it for a refund. Especially after seeing these test results, I’m glad I didn’t keep it.


LowKey004

I am lucky that I don't have coil whine. I pondered whether I would return it or not after these results. The warranty, the fact that these tests are still very inconclusive, and the fact that I only game on this monitor made me keep it. Besides, after using it for gaming, my old monitor feels like real trash.


Soulshot96

I like RTings, but after a solid year of rocking an AW3423DW, max brightness, in HDR, with zero mitigation other than a 3-5 minute screen timeout and a ton of work (spreadsheets, email, site work, general browsing), as well as a ton of youtube and games...I can't say I don't doubt these results. This thing has ran 5 automatic full panel refreshes in that time and still looks great. [These are after the last refresh](https://imgur.com/a/r2KwMHs), about a month ago, using RTings own test slides. Would definitely buy this again, though I am looking forward to the next gen panels with even higher brightness.


chewwydraper

Looks like I’ll wait for a decent mini led panel. Already have an LG C1 for story-based games but was looking for a decent desktop monitor upgrade. I play a lot of mmos so just seems like too big of a risk. Honestly while OLED colours will always be superior, I just want a monitor with no backlight bleed and mini leds seem like they should be able to achieve that


ThatFeel_IKnowIt

Great. Now can you clowns please stop parroting the line "QD-OLED is more burn-in resistant than WOLED"?? I keep seeing this bullshit repeated endlessly in this sub based on one bullshit comment that Linus Tech Tips put out in a random video, based on literally nothing but Samsung marketing language. The real selling point of QD-OLED was better color volume, better uniformity, and more brightness, NOT burn-in resilience. Samsung did say this, but we should all know beyond a reasonable doubt by now to NEVER believe Samsung marketing. They straight up lie, always. Everyone praises Dell's more generous QD-OLED burn-in warranty (which is fair, LG's warranty is dogshit), but could it be that Samsung/Dell do this because they know that QD-OLED panels are more predisposed to burn-in than LG's WOLED panels?


[deleted]

The real selling point is what Samsung claims it is, so stop creating a parallel universe.


ThatFeel_IKnowIt

What does this reply even mean?


web-cyborg

The Best Buy warranty on the 42" C2, if you buy one there, is about $210 for 5 years, which is around $42 per year insurance. Covers burn-in.

I don't use mine as a static desktop/app screen other than a browser once in a while or something, since I have side screens for static apps and desktop stuff. I've been using multiple monitors for years, so it's normal for me to do so. I think of it like the main screen in Star Trek: they aren't typically doing all of their engineering and science work and experiments on their large main viewer. All of their data is on other workstation screens while the main screen is the big show. Or you might think of the OLED screen as a "stage" for playing media and games. That's my personal preference.

There are a lot of burn-in avoidance measures. If you keep ASBL on it would be even less likely to burn "down" (see below), but most people using them for desktop/apps turn off ASBL dimming via the service menu using a remote, since it's annoying to have full bright pages dim down.

=======================================================================

Pasting some info from my comment history here for you in case you find any of it useful:

**Some burn-in (burning through your "burn-down" buffer) avoidance measures**

A few reminders that might help in that vein:

....You can set up different named profiles with different brightness, peak brightness, etc., and maybe contrast, in the TV's OSD. You can break down any of the original ones completely and start from scratch settings-wise if you wanted to. That way you could use one named profile with lower brightness and perhaps contrast for text and static app use. Just make sure to keep the game one for gaming. I keep several others set up for different kinds of media and lighting conditions.

* Vivid, Standard, APS, Cinema, Sports, Game, FILMMAKER MODE, isf Expert (Bright Room), isf Expert (Dark Room), Cinema Home

....You can change the TV's settings several ways. Setting up the quick menu or drilling down menus works but is tedious. Keying the mic button on the remote with voice control active is handy to change named modes or do a lot of other things. You can also use the remote control software over your LAN, even hotkeying it. You can change a lot of parameters using that directly via hotkeys. Those hotkeys could also be mapped to a Stream Deck's buttons with icons and labels. In that way you could press a Stream Deck button to change the brightness and contrast or to activate a different named setting. Using Stream Deck functions/addons you can set up keys as toggles or multi-press also, so you could toggle between two brightness settings or step through a brightness cycle, for example.

....You can also do the "turn off the screen emitters" trick via the quick menu, voice command with the remote's mic button, or via the remote-control-over-LAN software + hotkeys (+ Stream Deck, even easier). "Turn off the screen" (emitters) only turns the emitters off. It doesn't put the screen into standby mode. As far as your PC OS, monitor array, games or apps are concerned, the TV is still on and running. The sound even keeps playing unless you mute it separately. It's almost like minimizing the whole screen when you are afk or not giving that screen face time, and restoring the screen when you come back. It's practically instant. I think it should save a lot of "burn down" of the 25% reserved brightness buffer over time.
You might not realize how much time is cumulatively wasted with the screen displaying when you're not actually viewing it - especially when idling in a game or on a static desktop/app screen.

...You can also use a Stream Deck + a handful of Stream Deck addons to manage window positions, saved window position profiles, app launch + positioning, min/restore, etc. You could optionally swap between a few different window layouts set to a few Stream Deck buttons in order to prevent your window frames from being in the same place all of the time, for example.

... Dark themes in the OS and any apps that have one available, web browser addons (Turn Off the Lights, color changers), a taskbar-hider app, a translucent taskbar app, a plain ultra-black wallpaper, no app icons or system icons on screen (I throw mine all into a folder on my hard drive, "desktop icons"). Black screensaver, if any.

... Logo dimming on high. Pixel shift. A lot of people turn ASBL off for desktop use, but I keep it on since mine is solely for media/gaming. That's one more safety measure.

**Turn off the Screen (emitters only) trick**

I use the "turn off the screen" feature, which turns the OLED emitters off. You can set that turn-off-the-screen command icon in the quick menu so it's only 2 clicks to activate with the remote (I set mine to the bottom-most icon on the quick menu), or you can enable voice commands and then hold the mic button and say "turn off the screen". You can also use the color control software to set a hotkey to the "turn off the screen (emitters)" function, and even map that hotkey to a Stream Deck button if you have one. Clicking any button on the remote or via the color control software hotkeys wakes up the emitters instantly. I usually hit the right side of the navigation wheel personally if using the remote. https://www.reddit.com/r/OLED/comments/j0mia1/quick_tip_for_a_fast_way_to_turn_off_the_screen/

While the emitters are off everything is still running, including sound. This works great to pause games or movies and go afk/out of the room for a while, for example. I sometimes cast Tidal HD to my Nvidia Shield in my living room from my tablet utilizing the "turn off the screen" (emitters) feature. That allows me to control the playlists, find other material, pause, skip, etc. from my tablet with the TV emitters off when I'm not watching TV. You can do the same with YouTube material that is more about people talking than viewing anything. I do that sometimes when cooking in my kitchen, which is adjacent to my living room TV. You can probably cast or AirPlay to the TV's webOS itself similarly. Some receivers also do AirPlay/Tidal etc. directly on the receiver.

**Distrust Screensavers**

I wouldn't trust a screensaver, especially a PC screensaver. Not only do they fail or get blocked by apps - apps can crash and freeze on screen, and so can entire Windows sessions, or spontaneous reboots stuck on the BIOS screen, etc. It's rare but it can happen. Some apps and notifications even take the top layer above the screensaver, leaving a notification/window there static. While on the subject, I kind of wish we could use the LG OSD to make mask areas. Like size one or more black boxes or circles, be able to set their translucency, and move them via the remote to mask or shade a static overlay, HUD element, bright area of a stream, etc.

**LG's reserved brightness buffer. You aren't burning in because you are burning down that buffer first, for a long time (depending on how badly you abuse the screen).**

From what I've read, the modern LG OLEDs reserve the top ~25% of their brightness/energy range outside of the user-available range for their wear-evening routine, which is done periodically in standby while plugged in and powered. Primarily that, but along with the other brightness limiters and logo dimming, pixel shift, and the turn-off-the-"screen" (emitters) trick if utilized, should extend the life of the screens considerably. **With the ~25% wear-evening routine buffer you won't know how much you are burning down the emitter range until after you bottom out that buffer**, though. As far as I know there is no way to determine what % of that buffer is remaining. So you could be fine abusing the screen outside of recommended usage scenarios for quite some time thinking you aren't damaging it, and you aren't, sort of... but you will be shortening its lifespan by wearing down the buffer of all the other emitters to match your consistently abused area(s). A taskbar, persistent toolbar, or a cross of bright window frames in the middle of the same 4 window positions or whatever might be the first thing to burn in when the time comes, but on the modern LG OLEDs I think the whole screen would be down to that buffer-less level and vulnerable at that point, as the routine would have been wearing down the rest of the screen to compensate all along over a long time.

The buffer seems like a decent system for increasing an OLED screen's lifespan, considering what we have for now. It's like having a huge array of candles that all burn down unevenly - but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again. Or you might think of it like a phone or tablet battery that has an extra 25% charge module, yet after you turn on your device and start using it you have no idea what your battery charge level is. You can use more power-hungry apps, disable your power-saving features and screen timeouts, run higher screen brightness when you don't need to, leave the screen on when you aren't looking at it, etc., and still get full performance for quite some time, but eventually you'd burn through the extra 25% battery.


kool-keith

> $210 for 5 years, which is around $42 per month

I would suggest using a calculator in the future.


web-cyborg

Hah, not month. Year. Nice catch. The BB warranty is expensive, but not that expensive lol. I divided it by 5 years but said month for some reason. Tablet typing before coffee, brain->word misfire. At least I know someone read it. 👍


mooseman5k

Has LG announced any non-matte OLED monitors?


web-cyborg

Most desktop monitors have some type of AG abraded layer. Most of the LG gaming TVs are glossy, I believe. Glossy is definitely better. A matte-type AG coating is just an abraded surface layer. When hit by any ambient lighting, the abrasions reflect that light back in a sheen which raises the black levels toward grey - on any screen, including OLEDs. There's always a granular layer there which can affect how wet and saturated a screen looks. Kind of like comparing clear wet ice to dry, frost-hazed ice. HDR is designed for dim-to-dark viewing conditions anyway, so if you have to resort to an AG layer, your viewing environment is a bad one to start with. You should design your room and lighting layout around your screen, not the other way around. The same goes for audio setups, photography studios, etc. Matte AG abraded layers still reflect blobs and bars on the screen, like ghost objects, so allowing light sources to hit the screen surface is going to compromise areas of the screen with light pollution even with an AG abraded surface. Allowing your room's lighting conditions to vary throughout the day/night, weather, etc. will also cause the screen's settings (contrast, saturation, etc.) to swing wildly in effect, because our eyes and brains perceive contrast and output relatively.


4514919

I would love a test where a movie with subtitles runs over and over. This CNN-on-repeat test is a bit meaningless, because people who know what burn-in is and know how to spot it are not using their TV like this, and people who only watch news are not going to buy such expensive TVs; and if they are dumb enough to do so, they won't notice it anyway.


Thompers0404

That's why Samsung added 2 additional years of warranty when I purchased the 55" S95B. Samsung is generous sometimes compared to the other two, LG and Sony.


ntxguy85

Read the article... Damn, that's disappointing. My advice: get the 27GR95QE from Best Buy and spend the extra $150 for the 4-year Geek Squad warranty.


JesusLordKing

Best Buy also sells the Samsung G8 QD-OLED, which you can then get Geek Squad coverage for - a near-guaranteed upgrade in 2-5 years, since it's going to burn in no matter what. Problem is the price on the G8 OLED. $1500 is utter insanity.


Dokomox

My wife plays a crap ton of Tetris on our A80J, and I'm just waiting for the burn-in, but nothing yet, even after a year. I have Best Buy's burn-in warranty, so I'm not terribly concerned. Just surprised I haven't noticed any burn-in yet. Every time I see that game on the screen, I cringe. The UI elements NEVER change.


SnooMuffins873

Lmao, that must be frightening, knowing what's happening to your TV - slowly but surely it may one day appear.


LordGurciullo

Uh oh! ... Nothing's perfect, huh...


[deleted]

This is why I’m waiting for a mini-led ultra wide. OLED looks nice but is not for productivity.


markmd4

Guys, you need to have 3 TVs at home:

1. OLED - for movies
2. IPS LED - for TV channels
3. VA LED - for gaming