Soli
2016-04-30, 03:54
(This post was last modified: 2016-04-30, 04:02 by Soli.)
In layman's terms (or the best I can do without going all techie): displays in HDR mode use a different gamma curve, so the relative bit levels ramp much more slowly going from dark to bright. That's why an HDR picture will look too bright on a normal display. When the same picture is shown on a display in HDR mode, with the darker gamma curve, it will look right.
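To make the "different gamma curve" bit concrete, here is a quick Python sketch comparing the PQ curve HDR10 uses (SMPTE ST 2084; the constants below come from that spec) with a plain gamma-2.4 SDR curve. The 100-nit SDR peak is just an assumption for illustration, not what any particular display does:

[code]
# Minimal sketch: SMPTE ST 2084 (PQ) EOTF vs. a gamma-2.4 SDR curve.
# PQ constants are from the ST 2084 spec; the 100-nit SDR peak is an
# assumption picked for illustration.

def pq_eotf(v):
    """Map a normalized PQ code value (0..1) to absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_eotf(v, peak_nits=100.0, gamma=2.4):
    """Map a normalized SDR code value to nits with a simple power-law gamma."""
    return peak_nits * v ** gamma

for code in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"code {code:.2f}: PQ -> {pq_eotf(code):8.2f} nits, "
          f"gamma 2.4 -> {sdr_eotf(code):6.2f} nits")
[/code]

At half code value PQ lands around 92 nits while gamma 2.4 on a 100-nit panel gives roughly 19, so the same bits mean very different brightness depending on which curve decodes them.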
So, to convert an HDR picture to SDR, you need to do the opposite: adjust the picture to be darker so it will look right on a normal SDR display. But you have to leave out (or compress) the HDR part of the image, or it won't "fit". I'm not sure, but I suspect different manufacturers have slightly different methods of converting HDR to SDR, as I don't think there is a definitive standard for this, and manufacturers might want to be clever and keep parts of the HDR look when downconverting. (Again, I haven't read up on the specifics, although it's pretty straightforward mathematically.) Let's call this the secret sauce. One sauce is used when downconverting to SDR; another sauce is in play when the display switches over to HDR mode.
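Since there's no standard, any code can only show a sauce, not the sauce. Here's a minimal Python sketch of one textbook-style approach, an extended Reinhard tone map; the 100-nit SDR target and 1000-nit HDR peak are assumptions, and whatever real TVs actually do is certainly more elaborate:

[code]
# One illustrative HDR-to-SDR "sauce": an extended Reinhard tone map that
# compresses highlights into the SDR range instead of clipping them.
# The 100-nit SDR target and 1000-nit HDR peak are assumptions; this is
# not any manufacturer's actual algorithm.

def tone_map(nits, sdr_peak=100.0, hdr_peak=1000.0):
    """Compress an HDR luminance value (nits) into the 0..sdr_peak range."""
    x = nits / sdr_peak            # normalize to SDR reference white
    x_max = hdr_peak / sdr_peak    # brightest input that must still fit
    # Extended Reinhard: maps x_max exactly to 1.0, rolls off highlights.
    y = x * (1 + x / (x_max * x_max)) / (1 + x)
    return min(y, 1.0) * sdr_peak

for nits in (1, 50, 100, 400, 1000):
    print(f"{nits:5d} nits in -> {tone_map(nits):6.1f} nits out")
[/code]

Note how 1 nit passes through almost untouched while 1000 nits gets squeezed down to 100: the shadows survive, and the HDR highlights get sacrificed so the picture will "fit".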
HDR is still a mess, and the HDR metadata is dynamic, describing both max luminance and the color gamut. So the display has to somehow switch gamma/modes/gamut on the fly. A very special sauce indeed, and for the time being pretty much impossible to calibrate properly (although I have a rough idea of how they are going to solve this).
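For the curious, the HDR10 flavor carries roughly this information as metadata (SMPTE ST 2086 mastering-display color volume plus MaxCLL/MaxFALL), while Dolby Vision adds per-scene dynamic metadata on top. The field names and example numbers below are purely illustrative, not any real API:

[code]
# Rough sketch of HDR10-style metadata (SMPTE ST 2086 + content light
# levels). Field names are illustrative; the values show a typical
# BT.2020 / 1000-nit mastering setup, not real programme data.
from dataclasses import dataclass

@dataclass
class Hdr10Metadata:
    primaries_xy: tuple        # (R, G, B) chromaticities -> the gamut
    white_point_xy: tuple      # white point chromaticity
    max_mastering_nits: float  # peak of the mastering display
    min_mastering_nits: float  # black level of the mastering display
    max_cll: float             # brightest single pixel in the programme
    max_fall: float            # highest frame-average light level

example = Hdr10Metadata(
    primaries_xy=((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)),  # BT.2020
    white_point_xy=(0.3127, 0.3290),                                # D65
    max_mastering_nits=1000.0,
    min_mastering_nits=0.005,
    max_cll=1000.0,
    max_fall=400.0,
)
[/code]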