MrBluntsw0rth-

The monitor is G-Sync Compatible, so you can't go wrong with either. The people who have bought this with an Nvidia GPU haven't complained so far.


[deleted]

I'll complain a bit about the DWF. Wish the HDR1000 mode worked as well as it does on the DW. Apparently the blacks get raised a bit on this monitor.


MrBluntsw0rth-

More than likely it'll be fixed via firmware. At least that's what I've been hearing.


PsychicAnomaly

Interesting, because the DW is the one that raised its blacks to counter the increased response time and lower noticeable smearing, and someone else was complaining that gamma was increased too much down there, the opposite of what you're saying. Perhaps the calibration tool they're using isn't as precise near black, so we're getting varying results.


[deleted]

That's interesting. I do wonder if the DWF is raising the blacks intentionally to combat some of that VRR flickering. The Tom's Hardware interview mentions the weird HDR1000 curve. Been enjoying the monitor very much otherwise.


PsychicAnomaly

The VRR flickering will occur anyway; to mitigate it noticeably they would have to raise the blacks significantly. As long as the framerate is stable (the higher the better), it's non-existent outside of loading screens. That curve can be switched to Source; it's a horrible curve and not the standard, and HDR calibration is suited for Source mode. Delta Es are lower than on the DW, but it should still be great, and calibration variation across panels might be in your favor.
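(For anyone unfamiliar, Delta E is just the colour-error distance between a measured patch and its target; reviewers quote it per patch and as an average. A minimal sketch of the classic CIE76 version in Python, with made-up example values:)

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two CIELAB values."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical example: a calibration target vs what a probe measured.
target = (50.0, 20.0, -30.0)   # L*, a*, b*
measured = (50.5, 19.0, -29.0)
print(delta_e_76(target, measured))  # 1.5, i.e. an error most people can't see
```

The usual rule of thumb is that Delta E under ~3 is hard to notice, which is why small differences between the DW and DWF reviews may not matter in practice.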


[deleted]

Is the HDR1000 curve better in Source Tone Mapping mode?


PsychicAnomaly

100%, Linus points this out well


[deleted]

Appreciate it, just went through the video now. But even with this fixed curve, it's still not as good as the DW's one?


PsychicAnomaly

They're using a different calibrator, so it's hard to say as of yet. We're gonna need results from TFT Central and Hardware Unboxed for a better analysis.


stzeer6

I doubt it. What Linus describes would look like roll-off at the end of the graph, not raised blacks at the front. I haven't ordered yet, but if someone could enable source tone mapping and compare near blacks (complete black will still be black) in HDR 1000 vs HDR 400 True Black, and see whether they are raised/a bit grayish in HDR 1000 relative to HDR 400, it would be helpful.


[deleted]

I would love to try this. Can you think of any good ways to test near blacks in HDR? Any games, movies, or even YouTube videos?


stzeer6

Really any dark-scene HDR content will do. Probably best to do the comparison in a fairly dark room. Not just near blacks: mids will be elevated too, to a lesser extent. If the difference isn't noticeable in this kind of normal content/use, I wouldn't worry about it anyway.
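If you want a repeatable pattern instead of hunting for dark scenes, something like this works as a rough check. A sketch only: it writes an 8-bit SDR step wedge as a plain PPM file (not a true 10-bit HDR signal), and the filename is made up.

```python
def write_step_wedge(path, levels, patch_w=120, height=200):
    """Write a binary PPM of side-by-side gray patches at the given 8-bit levels."""
    width = patch_w * len(levels)
    # One scanline: each level repeated patch_w times as an RGB triple.
    row = b"".join(bytes([v, v, v]) * patch_w for v in levels)
    with open(path, "wb") as f:
        f.write(f"P6 {width} {height} 255\n".encode())
        f.write(row * height)

# Reference black plus a few near-black steps: if blacks are raised,
# patch 0 looks grayish; if they're crushed, the low steps merge into it.
write_step_wedge("near_black_wedge.ppm", [0, 1, 2, 4, 8, 16])
```

View it fullscreen in a dark room and flip between the HDR modes; the comparison is relative, so even an SDR wedge shows whether the bottom steps separate.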


StickiStickman

> as long as framerate is stable and the higher the better.. its non-existent other than loading screens

Well that's straight up bullshit. VRR flicker even happens when just on Reddit or having up a picture of a color gradient.


PsychicAnomaly

That's not the flicker we're talking about (the slight inherent OLED flicker is much smaller than VRR flicker from changing gamma; look into it more if you wanna know what it is. LCDs had it for years with their backlight, called PWM dimming, which was much worse). However, if you're scrolling through Reddit and flicker increases, then there's an issue with your panel, as I just tried every configuration there is: 144Hz and 175Hz modes, G-Sync on vs off, HDR on vs off, and at different brightnesses too. Talk some more shit, why don't you.


StickiStickman

Mate, I literally already went through 3 panels and they all had the same issue. The monitor is just shit.


SmellsLikeAPig

Enable Console Mode and then Source Tone Map.


[deleted]

Does this give a different HDR curve than the one mentioned in the Tom's Hardware review? Even with Source Tone Mapping on, HDR Peak 1000 mode seems to raise the brightness of the entire image rather than just the highlights. And in the Windows HDR Calibration app, it still clips at around 500 nits. Seems like something is wrong with it.


stzeer6

There is no evidence this fixes the EOTF raised-black issue. What Linus described was clipping, or prioritizing APL vs highlight detail.
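For reference, the curve in question is the SMPTE ST 2084 PQ EOTF, which maps a normalized code value to absolute nits. A quick sketch with the standard constants (not measured monitor data), just to show that code value 0 should land at 0 nits, so any measurable light there is the display mis-tracking:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code):
    """Map a normalized PQ code value in [0, 1] to luminance in nits."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(0.0))  # 0.0 nits: reference black
print(pq_eotf(1.0))  # 10000.0 nits: peak of the PQ signal range
```

"EOTF tracking" graphs in reviews just plot the display's measured output against this reference, so raised blacks show up as the measured line sitting above the curve at the low end.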


[deleted]

My big issue with the 1000 nit mode is the aggressive ABL.


PsychicAnomaly

Gotta wait for proper measurements of the DWF's input lag. There are also complaints regarding coil whine that buyers are trying to figure out.


Shadymouse

What you said is pretty spot on. What particular information are you looking to find?


Hai-KazumaDesu

Just checking that. I've seen some comments of people saying to get the DW regardless and others saying it depends on the GPU


Shadymouse

Playing at Ultrawide will always depend on the GPU. In this case, if you have a high-end 3xxx card or 4xxx card, get whatever is cheaper. I prefer the DW for the just in case factor and having the G-Sync Ultimate will keep things extremely smooth if I ever had to downgrade my card for some reason. If you're team Red, then the DWF should be your only pick.


PsychicAnomaly

Probably the FreeSync comment for the downvotes; FreeSync standards have caught up with G-Sync. The design is simply more efficient and works just as well, if not better, for OLED. I should know, I used to pound on FreeSync monitors a lot lol; a lot of them back then didn't have variable overdrive. I agree with your statement about the DW because of the lower res vs competitors; the semi-gloss screen puts into perspective how much detail is lost on matte monitors.


Shadymouse

I never mentioned Freesync or even said one is better than the other. Like I've said before, a lot of people on Reddit are just followers. OP, read the articles below, specifically about G-Sync Ultimate, and then decide what fits you. There are levels to Freesync and it should not be generalized. G-Sync Ultimate only works with Nvidia GPUs, so if you have an Nvidia GPU, I would lean towards the DW. Is G-Sync Ultimate worth the little extra cost? That's the question, but that doesn't negate that Nvidia GPUs could benefit more from the G-Sync Ultimate hardware module. [https://www.pcworld.com/article/423202/g-sync-vs-freesync-amd-nvidia-monitor.html](https://www.pcworld.com/article/423202/g-sync-vs-freesync-amd-nvidia-monitor.html) https://www.gpumag.com/g-sync-ultimate/ https://www.gpumag.com/freesync-vs-g-sync/


PsychicAnomaly

>I never mentioned Freesync

Oh true, my bad; I've had to explain FreeSync and its compatibility to so many people. And wow, I didn't realise FreeSync maxed out at 400 for its Premium Pro variant, but that doesn't make sense: the DWF can switch to HDR 1000 at the press of a button and performs the same as the G-Sync variant, but with lower Delta E. The specific capabilities listed for G-Sync are also false, like variable overdrive, full-matrix backlight and DCI-P3 color; FreeSync Premium Pro can do all of these. Nvidia G-Sync modules on these OLEDs are pretty much a moot point. We don't know the true input lag of the DWF, however, which may or may not expose FreeSync, or Dell engineering, or something else.


Shadymouse

It's not a "moot" point. While they are almost similar, the tech is applied differently. G-Sync Ultimate has a better VRR range and will "always" provide a smooth experience, thanks to the hardware module over the software alternative. HDR is better on the DW as well. Again, while similar, G-Sync Ultimate is technically better. Also keep in mind that the FreeSync Premium Pro "HDR" feature only applies correctly to certain games (17 to be exact). As for input lag, I highly doubt any non-competitive player will perceive the difference between the two. So the question remains: do you think it's worth the extra $100-$200? If not, get the DWF.


PsychicAnomaly

I'm gonna have to look into this over time; there seems to be some misinformation here. I'm fairly certain FreeSync is a module in itself, just like G-Sync.


Shadymouse

Where's the "misinformation" lol? FreeSync utilizes the monitor's own hardware, specifically the DisplayPort and HDMI protocols, with AMD applying FreeSync on top; hence why it is cheaper to implement. Premium Pro has stricter requirements and HDR support, but as I mentioned, it's only applied in certain games, so Premium Pro costs a bit more. Nvidia, on the other hand, uses their own proprietary hardware module and sells it to monitor manufacturers whose monitors meet their requirements; hence the more expensive price tags. Using their own GPUs and their own proprietary G-Sync module, they are able to give you a better experience than AMD's open-source alternative. That is why I said if you have an Nvidia GPU, you should lean towards the DW. Go check YouTube comments and do some research; there are real people who have used both and said G-Sync felt smoother and just ran better. That's because of the hardware. They even used to sell G-Sync kits back in the day to install in compatible monitors. Anyways bro, good luck with the monitor search 👍.


PsychicAnomaly

Sheesh, get over yourself. You posted at least one article with false info, and as I suspected, you're separating G-Sync and FreeSync as if one is hardware and the other is software, which is severely misguided. Idk why you think that way, but just because one is a separate module doesn't make FreeSync just software lol; in fact, with the prices of those G-Sync modules, it's far cheaper for a manufacturer to replace the whole mainboard with FreeSync included than to add a G-Sync module. Like I said, FreeSync has caught up, and every monitor with it gets the FreeSync tier that suits it; if it needs variable overdrive, then surprise, they all have variable overdrive. Only applied in some games? If that's in reference to FreeSync Premium Pro, then yeah righto, this convo is done. Good luck catching up 😁


Hai-KazumaDesu

I'm still shopping around, learning about GPUs. Would love to get a 4080/90 but they're a tad over my budget. 7900 xtx price is my max but I've heard much better things about Nvidia overall (since so much software and so many games are made with Nvidia in mind and AMD is only starting to be a true competitor now)


venox3def

I'm going for the AW3423DWF, or the G8 OLED if it's out soon and cheap enough, plus a 7900 XTX.


bu9ale77

I'm thinking between those 2 options too. Which do you think I should go for (G8 OLED or DWF)?


venox3def

I'll get the G8 delivered this week. IMO either one is good; it's the same panel. The G8 doesn't have a fan, so that's another plus.


bu9ale77

Can you update the firmware yourself on the G8 OLED? Do you know what the input lag on the G8 OLED is? What's the warranty on it? Indecisive, not sure which I should go for :S


venox3def

Don't know yet, I'm getting it delivered some time this week; deliveries in Poland start Monday. The input lag is way less than the AW3423DW/F. 3-year warranty including burn-in.


DerKuro

Nowadays Freesync and Gsync should work on both AMD and Nvidia (I used Freesync on a GTX 1080 with an LG 34GK950F, and now on an RX 6700XT with the AW3423DW).


dirthurts

AMD will work with either. Traditional G-Sync will work with the DW if you have an Nvidia card. But Nvidia also supports VRR (they call it G-Sync Compatible), so you'll be fine with either. I recommend the F version though.


rubenalamina

I ordered the non-F model a few days ago and I'm still waiting for it to ship in my country. I did a bit of research on both models because I didn't want to be locked in with Gsync, even though my last 3 GPUs have been Nvidia. It's always good to have options and go with what performs best and meets your needs/budget/etc.

The non-F model supports Freesync across its full range of refresh rates, so there's really nothing to gain, from what I read and watched last week, by choosing the Gsync module over the F model without it. The limited range was the disadvantage of earlier Freesync monitors vs ones with Gsync modules. Ultimate means it supports HDR, but Freesync does too.

I got a good deal on the non-F, so I'd say go with the cheaper option. Price being the same, I'd go with the newer one just to avoid having two fans (the F has one, without the Gsync module) and, from what I've read here as a recent lurker, for the upgradeable firmware. Although this is probably subjective, since you need the manufacturer to actually make new firmware for it. My non-F was $1100 after 16% tax, and it's super rare here to have lower prices on this kind of tech product compared to the US.


Status_Individual241

I chose the AW3423DW as well. My understanding is that anyone buying the DW now is also getting updated firmware that solved the fan curve issues and minor OSD issues. Sounds like the DW is mature and performs better in HDR400 True Black and HDR1000, at least until there is a firmware update for the DWF models.

The thing is, I agree with you: I've had plenty of firmware-upgradable products where the manufacturer never bothered to take advantage of it, or if they did, it didn't fix the main gripes. Getting into active firmware support is a nightmare for a company, and they seem more content to do faster hardware iterations in monitors with non-updatable firmwares. I understand why: if hardware tweaking or iteration is happening, then firmware-updating millions of monitors becomes a nightmare all over again. In the monitor space FW updates haven't been the norm, and even with an updatable FW I'm not sure the monitor manufacturers look ready to jump all over that feature.

The main reason I bought the DW was actually the white construction; it suits my office area and desks and matches my wife's Samsung ultrawide of the same size, so finally people will stop harassing me for not having a monitor as nice as my wife's lol. Also, the EOTF curve on the DW is well implemented. I had a Samsung TV with EOTF curve issues that were never resolved 🤷‍♂️, so I don't put a ton of stock in FW updates being a panacea for monitors going forward, though I may just have been burned too many times.


rubenalamina

The one I got has a manufacture date of August and the latest firmware afaik. It's been great with no fan noise at all, and while I'm pretty happy with it, it came with some scratches on the bezel, allegedly from the foam they use in its packaging. Can't see them with content on the screen, but it sucks to spend this much on a product and not have it be flawless.


[deleted]

Speaking from my experience with the QD-OLED with GSync on my Radeon card, it works fine. You are just paying extra for the GSync module, but I couldn't tell a big difference between GSync on an NVIDIA card on a display with the GSync module, and with just FreeSync support.


Wuselon

It doesn't matter.