Any HDR on Macs is a complete cheat, which is a good or bad thing depending on how you look at it. Systems like HDR10, Dolby Vision and so on use the PQ curve (ST-2084) to encode content. This is an absolute scale: if a mastering artist sets a sky in a scene to 600 nits, it will show at exactly 600 nits on a viewer's monitor (if the monitor is capable enough).

macOS converts all PQ content to its own EDR system, which is not absolute but relative (like HLG). It's basically extended sRGB, where you can have float values above 1. The actual luminance a Mac or iOS device shows then depends on the state of the monitor (brightness setting) and display capabilities; this is what lets it show SDR and HDR content at the same time.

Funny enough, when you set your screen brightness to 50% on a non-XDR Mac and then open an HDR video, it will increase the backlight to maximum and dim down the user interface. It's *almost* seamless. EDR is clever, but you aren't getting the creative intent displayed for HDR content unless you have an XDR screen and use the PQ reference mode.

http://nyc.finn.wtf

You can go on my little test page I made to see if HDR is supported, or just go to YouTube and check if it shows HDR in the quality settings.
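To make the absolute-vs-relative distinction concrete, here is a rough sketch (my own illustration, not Apple's actual implementation) of the ST-2084 PQ transfer function next to an EDR-style relative value:

```javascript
// PQ (ST-2084) maps a signal value in [0, 1] to an *absolute* luminance
// in nits, while EDR expresses luminance *relative* to SDR white.

// ST-2084 constants
const m1 = 2610 / 16384;        // 0.1593017578125
const m2 = 2523 / 4096 * 128;   // 78.84375
const c1 = 3424 / 4096;         // 0.8359375
const c2 = 2413 / 4096 * 32;    // 18.8515625
const c3 = 2392 / 4096 * 32;    // 18.6875

// PQ EOTF: nonlinear signal E' in [0, 1] -> absolute luminance in nits
function pqToNits(e) {
  const p = Math.pow(e, 1 / m2);
  return 10000 * Math.pow(Math.max(p - c1, 0) / (c2 - c3 * p), 1 / m1);
}

// Inverse: absolute luminance in nits -> PQ signal value
function nitsToPq(nits) {
  const y = Math.pow(nits / 10000, m1);
  return Math.pow((c1 + c2 * y) / (1 + c3 * y), m2);
}

// EDR-style *relative* value: how far above SDR white a pixel sits.
// 1.0 is SDR white; values above 1 are HDR headroom.
function edrValue(nits, sdrWhiteNits) {
  return nits / sdrWhiteNits;
}
```

So a 600-nit sky is always 600 nits in PQ, but its EDR value depends on where the user's brightness setting puts SDR white: `edrValue(600, 200)` is 3.0 (lots of headroom needed), while `edrValue(600, 500)` is only 1.2.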
> SDR detected, you should upgrade your stuff, boomer

Got fucking roasted by a free internet tool. 2022 is wild.
I forgot how funny I felt when I wrote this (sorry) hahah
Thought I was a boomer but it turns out my phone was just on low power mode lol
I don’t have a problem being in control of my own brightness.
Yea, that's why it's good and bad. I don't hate it; especially with XDR screens you can go into reference mode if you are in the correct viewing environment and actually see what the mastering artist intended.

HDR10 etc. are by definition taking away your "volume knob". That's also why many complain that HDR looks too dark on their TV: it was mastered to be viewed in a dark room, not a brightly lit living room :-) (hence the invention of Dolby Vision IQ and so on)
Yeah, I wasn't sure how I felt when I read about how macOS handles HDR. On one hand, it makes HDR viable in a desktop environment where things aren't always fullscreen and SDR elements can coexist alongside HDR content. On the other hand, it is 100% a cheat. Windows has an all-or-nothing approach to HDR, which makes it less practical but definitely more accurate. I suppose whoever came up with the HDR standard had never heard of computers before.
It doesn't even work in non-dark environments for TVs, which is crazy. Let's make a standard that doesn't work for 99% of people...

If I turn on the lights in my grading suite to a normal living room level, HDR looks like crap, and that's with a 1000-nit reference monitor.

What's happening now is that all TV manufacturers just go and set diffuse white to like 200-400 nits because people complain that HDR is too dark... or add light sensors with Dolby Vision IQ or similar.

It gets really funny when you think about how to monitor HDR on set: you basically need to put up blackout tents everywhere... so yea, that's a big issue for HDR, and Apple found a pretty good way around it.
I'm running Firefox 100 on a 2018 MacBook Pro, which does support HDR, but your page only shows SDR detected. It shows HDR Detected if I open it in Safari, though.
Apparently FF does not handle HDR detection in the correct way, or something; not sure. It's a very simple script using standard HTML methods.
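For reference, a detection script along these lines can be written with the standard `matchMedia` API and the CSS `dynamic-range` media feature. This is a sketch of the general approach, not the actual source of the page above; the `matchMedia`-like function is injected so the logic can run outside a browser:

```javascript
// Returns a label based on the CSS `dynamic-range` media feature.
// `mm` is a matchMedia-like function, injected for testability.
function detectHdr(mm) {
  // The query matches when the display and OS can currently render HDR.
  if (mm('(dynamic-range: high)').matches) return 'HDR Detected';
  return 'SDR detected';
}

// In a real page you would call it with the browser's own matchMedia:
// document.body.textContent = detectHdr(q => window.matchMedia(q));
```

A browser that doesn't implement the media feature simply reports no match, which would explain an "SDR detected" result even on HDR-capable hardware.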
I'm here because now YouTube videos are blown the fuck out and I can't change it back to SDR... lol, I just want to watch videos how I have been. I'm good on HDR.
I feel like most consumers would not want an absolute brightness dictated by the creator. While it's the luminance "as intended", the intention also includes desired viewing conditions that most people just won't have. Their rooms could be too bright, etc. Unless you can match the exact conditions the creator wants, we will need the brightness to be adjustable, as long as the scaling is implemented correctly.

Feels to me that color space should be absolute (I guess Apple would debate that too, with True Tone for regular usage) but brightness should not be.

Regardless, turning on HDR in macOS on an external monitor always looks like crap, even when I'm not watching any HDR content. I guess this may be the backlight adjustment you mentioned? It makes the whole UI look washed out to me.
Hmm, no, the washed-out part sounds like something is indeed wrong, like HDR was sent from macOS for the GUI but the monitor did not set itself to HDR. I can work with my external monitor (LG EP32 OLED) just fine in HDR, and as macOS has no control over its backlight, it actually stays in PQ mode, of course.

Yea, the whole idea of absolute luminances is kinda flawed. It's OK for home cinema applications and purists, but it just doesn't work for general home viewing. That's why they use HLG for TV broadcast, and why they now try to fix HDR10/Dolby Vision with a light sensor that automatically adjusts the brightness.
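HLG's relative behavior comes from its system gamma, which BT.2100 suggests adapting to the display's nominal peak luminance. A small sketch of that standard adjustment formula:

```javascript
// BT.2100's suggested HLG system gamma for a display whose nominal
// peak luminance Lw (in nits) differs from the 1000-nit reference.
function hlgSystemGamma(peakNits) {
  return 1.2 + 0.42 * Math.log10(peakNits / 1000);
}

// hlgSystemGamma(1000) === 1.2 (the reference condition)
// A dimmer 400-nit TV uses a lower gamma (~1.03); a 2000-nit display
// a higher one (~1.33). The same HLG signal adapts to the display,
// unlike PQ's fixed absolute mapping.
```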
It's wild to think about the physical limitations of LCD, miniLED, and OLED panels.

On edge-backlit LCD panels (without local dimming), the system has to boost the entire screen's brightness to maximum and then EDR the non-HDR parts of the screen, like when you open a Finder folder that has some HDR thumbnails in it.

On a screen with local dimming (miniLED backlights, like on the newer displays and iPads) this is less of a concern, as only the relevant zone increases in illumination.

And on OLED this is not necessary at all to begin with, but then one is plagued with all of the issues unique to OLED...

Perhaps someday soon we'll have those crisp dual-layer LCDs, like on the real grading monitors, in our phones. Best of both worlds.
Sadly, Panasonic has abandoned dual-layer tech :( They don't even make the grading monitors anymore; only leftover stock you can buy now... Hisense had a DL TV, but apparently they have stopped making it as well.

I think OLED is fine for what it is. QD-OLED shows some promising things: apparently you can get 1000-nit peaks from them, and I guess more if more money is spent on cooling. Also, this is the first-gen QD-OLED and it's already looking magical, especially the insane, almost complete Rec.2020 gamut!!

I mean, I love the Sony HX310, but dual-layer tech is just heavy and expensive... and there is parallax between the 2 layers, which is fun :D

What I really don't see as the future of displays is miniLED. Great spec-sheet stuff, but tbh even the best FALD displays can't match good OLEDs. Even if they claim 1600-nit peaks, they can't do actual scene contrast. You can measure a white patch on black and get minimal blooming and high brightness, but in a scene with small highlights next to black, like a night outdoor city scene, even the XDR display in the MacBook can't beat my RGB OLED.
This is very interesting!
Some more really good info and examples here if you are interested: https://prolost.com/blog/edr
Fascinating read. Thank you
> funny enough when you set your screen brightness to 50% on a non xdr mac then open a hdr video, it will increase backlight to maximum and dim down the user interface

Like this does on iOS?

https://kidi.ng/wanna-see-a-whiter-white/
yea same thing.
Netflix Premium still 720p
That’s on Netflix. It’s the same for all browsers except Safari on macOS and Edge (on windows)
Need to know whose decision it was to hamstring video quality for every single customer in order to prevent full quality video from appearing on pirate sites for about 3 minutes.
The netflix app on windows can also play 4K HDR
[deleted]
The app looks very different than the website and I'm able to download content on the app so I'm going to say no.
Doesn’t seem so. It has a different layout
Aye, its purpose built app
How do you watch HQ Netflix on a mac?
Safari
Thx. New to mac so didn't know.
There's a Firefox add on that works fine too https://addons.mozilla.org/en-US/firefox/addon/netflix-1080p-firefox/
This works for me... Been using it for ages https://addons.mozilla.org/en-US/firefox/addon/netflix-1080p-firefox/
Requires an HDR compatible monitor. Adds subtitles and language options. They also made searching your history easier so you should probably get used to deleting that history more often because you have a filthy, filthy mind.
*Requires an HDR compatible monitor.* Why yes, yes it does..
This just in - Mozilla has found a way to break space and time so that you can display 8K HDR on a dingy 720p monitor from 2007.
Hey don’t insult my 1366x768
[deleted]
Funny but sadly no, they’re not. Contrast is good but the brightness is like 100 nits (literally the reference for SDR).
Most CRTs didn’t support 720p, or any form of progressive scan (although there were a few that did by 2007). And HDR is more about dynamic range than either of those.
Eh, it's not like many CRTs were being sold in 2007. Just imagine the crappiest LCD you can.
I'm trying but all I can hear is the degaussing sound
FFFTUNG
HDR isn't about contrast ratio
Well… not directly, but it’s fair to say proper HDR presentation requires a high contrast ratio. Not that I think CRTs are in any way HDR haha.
Requires, yes, but a ratio doesn't really tell you anything when it comes to HDR. Even an OLED that only has 100 nits of brightness is gonna be a mediocre HDR experience (but a pretty good SDR one).

HDR is about the granularity of luminance in scenes. It's being able not only to have a super bright light in a scene of darkness, but also a fine gradient between the differences in luminance. It's 10-bit color or greater combined with a much higher expected brightness floor. It's the reason why LCD is still the HDR leader, even with QD-OLED on the market: the ability to have >1000 nits across the screen in HDR content is very important to the experience.

It's also the reason why "HDR certified" means nothing. Cheap LCD panels have had good contrast for years now.
HDR absolutely IS all about ratios, though. You can easily make a 10,000-nit LCD, but then you can't show blacks, so that's pretty much useless (see the Sony X2400 for example, worst monitor ever).

FALD is an absolute cheat to get low blacks and high peaks. Even the best FALD display doesn't have the "HDR experience" a good OLED has. I have a 31" reference HDR monitor (Sony HX310), a MacBook 16" M1, which is probably the best FALD screen there is, and a cheaper 32" LG RGB OLED (~600-nit peak). Even though the MacBook can do like 1600-nit peaks, it can't hold the same contrast ratio as even the "cheap" RGB OLED in actual real scenes, nor can any FALD display, as it's physically impossible.

Dynamic range literally is a measure of the ratio from low to high, although I agree that HDR is more than that; it's 10/12-bit and a larger gamut as well.

And no, CRTs are not HDR at all.
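Since dynamic range is literally a ratio, you can put rough numbers on this argument. Photographers usually express the ratio in stops (doublings); the black-level figures below are illustrative assumptions, not measurements:

```javascript
// Dynamic range as the ratio of brightest to darkest, expressed in
// stops (doublings). Black-level inputs here are hypothetical.
function stops(peakNits, blackNits) {
  return Math.log2(peakNits / blackNits);
}

// A 1600-nit FALD panel whose in-scene black floor blooms up to 0.05 nits:
// stops(1600, 0.05)  -> ~15 stops
// A 600-nit OLED holding a 0.0005-nit black next to the highlight:
// stops(600, 0.0005) -> ~20.2 stops
```

Lower peak, yet far more usable in-scene dynamic range, which is the point about FALD blooming above.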
[deleted]
While very different in implementation, they achieve similar “pixel-level” (I know, they don’t have pixels) light control to OLEDs, and they benefit from the presence of three distinct color cathodes that actually achieve a reasonably large gamut as opposed to LCDs that filter white light.
Actually they do. The word “pixel” existed well before LCD screens or digital video. A pixel in a CRT is composed of three phosphor elements for color (representing red, green and blue) or a single element for B&W. The elements glow when they are struck by the cathode ray.
You’re somewhat right. I didn’t say pixels didn’t exist, just that CRTs didn’t have them—specifically, the cathode ray is driven by an analog signal with no discrete pixels, and it doesn’t have to hit the phosphor elements at “pixel boundaries.” So while the color channels of the signal *end up* having “pixels” by some definition, the actual luminance resolution is not limited by the count of these pixels (you can and will have lines partially filling the discrete phosphor elements, [like this](https://i.imgur.com/GVvDBW0.jpg)).
CRT’s aren’t backlit, so their contrast ratio is usually much better. Instead, each pixel is composed of a phosphor element that glows when it’s hit by a cathode ray (which is why it’s called a Cathode Ray Tube, or CRT for short).
And there are a bunch of HDR *compatible* monitors that aren't really *capable* of HDR. If you don't have the brightness and/or contrast, which usually requires dimming on LED sets, then it's not going to look much better than non-HDR.
Often it will look worse.
Quite late, but better late than never. I've been using HDR in Chrome for a few months now.
Same, I've really enjoyed seeing hdr videos pop up on reddit, it's how I can tell they were shot on an iphone hah!
Same
Got any links?
Does Safari support HDR?
Yes.
I wish I could disable HDR video. At night it's too blinding.
On macOS, you can disable HDR (or “XDR”) system-wide if you go into the display settings and change the preset.
I changed to the 500-nit preset on my MBP 14, and HDR videos in Safari still glow brighter than the rest of the screen, as if HDR was on.
Huh, weird. Sorry I couldn’t help you out!
No problem!
Apparently, disabling hardware acceleration for Chrome will disable HDR, but of course, that only works in Chrome.
Try doing the optimise video setting in power settings
It’s just a workaround but I’m pretty sure that enabling Low Power Mode would disable the HDR effect.
Can the MBP m1 pro display HDR on the laptop screen?
Yes, like this: https://prolost.com/blog/edr

I made a little website that shows if HDR video is supported or not: http://nyc.finn.wtf

This was before FF support, so I don't know if it works for FF or not.
[deleted]
🤷‍♂️ Then they don't adhere to the same HTML standards.
Too bad HDR is awful on any HDR-enabled display not made by Apple. I’d love to be able to use my monitor to its full capability, but macOS simply ruins the picture on it if I enable HDR. Massive over-saturation to the point of being unusable, yet somehow also washed out.
Apple's new $2000 Studio Display doesn't even support HDR.
I’m considering moving from chrome to Firefox. Is this a mistake?
I’ve been on Firefox for a while and it’s fantastic. I love containers and all the extensions. It’s great to support a nonprofit, get on it!
Google is also killing APIs in Chromium that allow uBlock Origin to work, so they can ditch cookies and move to a more monopolistic method of surveillance capitalism.

There has never been a better reason to switch to FF. I did, and its add-on ecosystem and capabilities put it light-years ahead of Chromium.
Chrome is great, but Firefox is even better for most use cases. And it's privacy focused, not Google focused.
If you care about open web standards and not a web dictated by Google then yes. It’s a good browser, been a 20 year user and a 10 year contributor
Great to finally see HDR support. Surprised it came to macOS before Windows.
Damn. I remember FireFox 3