[split] TV HDR mode
#1
(2019-10-17, 17:41)lightsout Wrote:  

Hi, I do video editing: I have always made 4K 8-bit videos and I watch them with a Shield TV and a Sony KD-65X8509C TV. With the Shield TV I have always used HDMI 3 set to "Standard". Today I created a 4K 10-bit file for the first time, so I chose the Advanced option on HDMI 3 (I don't remember the exact wording, but it is the highest quality, 4K 10-bit, etc.), and then the TV restarted. I also changed the settings on the Shield TV and chose "YUV420 10bit Rec.709".
As soon as I opened the Shield TV interface (the one with all the icons), and then Kodi to view that video, both showed very washed-out colours. I watch all my 4K videos with Kodi, but I was surprised that even the Kodi interface had changed colour. It is now a serious colour problem; the colours are missing. The video was also very, very faded: I had to increase the colour a lot. Before, the slider was at 54, and now I had to move it to 80. Why are the colours so faded after choosing the Advanced option on HDMI 3? Is something wrong? Do you get this problem too? Do I have to change settings on the Shield TV?

Thank you for helping me understand.
Reply
#2
I assume you mean you've enabled the TV HDR mode. At a guess, as this is the first time you're using HDR, you'd need to calibrate the TV picture settings again.
Reply
#3
(2020-05-07, 08:37)FoxADRIANO Wrote:
(2019-10-17, 17:41)lightsout Wrote:  

Hi, I do video editing: I have always made 4K 8-bit videos and I watch them with a Shield TV and a Sony KD-65X8509C TV. With the Shield TV I have always used HDMI 3 set to "Standard". Today I created a 4K 10-bit file for the first time, so I chose the Advanced option on HDMI 3 (I don't remember the exact wording, but it is the highest quality, 4K 10-bit, etc.), and then the TV restarted. I also changed the settings on the Shield TV and chose "YUV420 10bit Rec.709".
As soon as I opened the Shield TV interface (the one with all the icons), and then Kodi to view that video, both showed very washed-out colours. I watch all my 4K videos with Kodi, but I was surprised that even the Kodi interface had changed colour. It is now a serious colour problem; the colours are missing. The video was also very, very faded: I had to increase the colour a lot. Before, the slider was at 54, and now I had to move it to 80. Why are the colours so faded after choosing the Advanced option on HDMI 3? Is something wrong? Do you get this problem too? Do I have to change settings on the Shield TV?

Thank you for helping me understand.
"Today I created a 4K 10-bit file for the first time"

Is that 10-bit HDR or 10-bit SDR?

Is it Rec 2020 or Rec 709 gamut?

If it is HDR - is it HLG (which the Shield TV doesn't handle natively) or HDR10/PQ (aka ST.2084), which the Shield TV can output natively?  (The Shield TV can output Rec 2020 HLG flagged as Rec 2020 SDR - and if your TV allows it you can then manually put it into HLG mode to get a decent HLG result.)  However, HLG displayed as SDR would not look desaturated or have lifted blacks (as it's SDR compatible for about 75% of its range).

If it's HDR and it is HDR10/PQ (aka ST.2084) you need to ensure that you have an HDR10 route all the way to your TV.  If not you will have lifted blacks (blacks that appear grey) and dull whites (they will be dim), and this will also desaturate your colours.  If you have Rec 2020 colour being displayed as Rec 709 then again you will have distorted colour, because the primaries are different AND the YCbCr to RGB matrix is different.
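
For illustration, here is a minimal Python sketch of that matrix point. The Kr/Kb constants are the published BT.709 and BT.2020 values; the sample YCbCr triplet is just an arbitrary example, and the gamut/primaries mismatch is a separate error on top of what this shows:

# Minimal sketch: decode one limited-range 8-bit YCbCr sample with the
# BT.709 matrix and then with the BT.2020 matrix, to show how the same
# code values land on different RGB depending on which matrix the
# display assumes.  (Primaries/gamut differences are an additional error
# not modelled here.)

KR_KB = {
    "BT.709":  (0.2126, 0.0722),
    "BT.2020": (0.2627, 0.0593),
}

def ycbcr_to_rgb(y, cb, cr, standard):
    """Limited-range 8-bit Y'CbCr -> 8-bit R'G'B' using the chosen matrix."""
    kr, kb = KR_KB[standard]
    kg = 1.0 - kr - kb
    yn = (y - 16) / 219.0            # luma normalised to 0..1
    pb = (cb - 128) / 224.0          # chroma normalised to -0.5..0.5
    pr = (cr - 128) / 224.0
    r = yn + 2.0 * (1.0 - kr) * pr
    b = yn + 2.0 * (1.0 - kb) * pb
    g = (yn - kr * r - kb * b) / kg
    to8 = lambda v: max(0, min(255, round(v * 255)))
    return to8(r), to8(g), to8(b)

sample = (120, 100, 200)             # an arbitrary, fairly saturated sample
for std in ("BT.709", "BT.2020"):
    print(std, ycbcr_to_rgb(*sample, std))
# Prints (250, 89, 62) for BT.709 and (242, 80, 61) for BT.2020 - so a file
# encoded with one matrix but decoded with the other ends up with shifted
# colours before the gamut mismatch is even considered.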

I believe your Sony TV is like mine - and lets you run in Auto HDR mode or manually force HDR10 or HLG display mode for sources that aren't correctly flagged (very useful for HLG content output flagged as SDR but left in HLG EOTF), or disable HDR entirely forcing all sources to be displayed in SDR (which will look terrible for HDR10/PQ/ST.2084 but may look reasonably OK for a lot of HLG due to HLG's backwards compatibility method), and also lets you select between Rec 709 and Rec 2020 colour gamut. This is a per-HDMI input selection.

What does MediaInfo say about the file you are playing in Kodi on your Shield TV?

Could you post the Text output from MediaInfo (it's a separate Windows/Mac/Linux application that tells you pretty much everything about a file)? 'Text' is one of the output formats MediaInfo offers.

It sounds like you have an HDR vs SDR issue somewhere, but without knowing the format of the file you are playing it's difficult to tell for sure.
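
If MediaInfo isn't to hand, the same flags can be pulled with ffprobe (part of FFmpeg). A small sketch, assuming ffprobe is installed and on the PATH - the keys printed are ffprobe's own stream fields:

# Sketch: print the colour/HDR flags of the first video stream using
# ffprobe's JSON output.  Assumes ffprobe (FFmpeg) is installed.
import json
import subprocess
import sys

def video_colour_info(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_streams", "-print_format", "json", path],
        capture_output=True, text=True, check=True).stdout
    stream = json.loads(out)["streams"][0]
    keys = ("codec_name", "width", "height", "pix_fmt", "color_range",
            "color_space", "color_primaries", "color_transfer")
    return {k: stream.get(k, "not flagged") for k in keys}

if __name__ == "__main__":
    for key, value in video_colour_info(sys.argv[1]).items():
        print(f"{key:16}: {value}")

# A color_transfer of 'smpte2084' means PQ/HDR10, 'arib-std-b67' means HLG,
# and 'bt709' (or nothing flagged at all) means ordinary SDR.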

(Enabling the 'Advanced HDMI' option on a Sony TV's HDMI input simply allows the TV to accept, and flag that it accepts, the higher quality and higher bandwidth HDMI 2.0 modes that support 4:2:2 12-bit at 2160p50/60 - allowing for high quality 4K at 50fps and 60fps with >8-bit depth, and thus also allows for HDR at 2160p50 and 60 with 4:2:2 chroma)
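
As a rough illustration of why that setting matters, here is a back-of-the-envelope sketch. It assumes the nominal CTA-861 4K60 timing of 4400 x 2250 total pixels and the usual 340 MHz / 600 MHz TMDS clock limits for the lower-bandwidth and 18 Gbit/s HDMI modes; real HDMI signalling has extra overheads, so treat the figures as indicative only:

# Back-of-the-envelope sketch of which 2160p60 pixel formats fit which
# HDMI bandwidth tier.  Nominal numbers only - real signalling differs.

PIXEL_CLOCK_4K60 = 4400 * 2250 * 60 / 1e6   # ~594 MHz total pixel rate

def required_tmds_clock_mhz(sampling, bit_depth):
    """Approximate TMDS character rate needed for 2160p60 in a given format."""
    if sampling == "4:2:0":
        return PIXEL_CLOCK_4K60 / 2 * bit_depth / 8   # 4:2:0 halves the clock
    if sampling == "4:2:2":
        return PIXEL_CLOCK_4K60        # 4:2:2 rides in a 12-bit container at 1x clock
    if sampling == "4:4:4":
        return PIXEL_CLOCK_4K60 * bit_depth / 8       # deep colour scales the clock
    raise ValueError(sampling)

for sampling, bits in (("4:2:0", 8), ("4:4:4", 8), ("4:2:2", 12), ("4:4:4", 10)):
    clock = required_tmds_clock_mhz(sampling, bits)
    verdict = ("fits the lower-bandwidth mode (<= 340 MHz)" if clock <= 340 else
               "needs the 18 Gbit/s 'Advanced' mode (<= 600 MHz)" if clock <= 600 else
               "does not fit HDMI 2.0 at all")
    print(f"2160p60 {sampling} {bits}-bit -> ~{clock:.0f} MHz: {verdict}")
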
Reply
#4
(2020-05-07, 09:51)Hitcher Wrote: I assume you mean you've enabled the TV HDR mode. At a guess, as this is the first time you're using HDR, you'd need to calibrate the TV picture settings again.

Wow, you are right, I'm sorry for my inexperience. I don't want to activate HDR because my videos aren't HDR. So do you suggest using the HDMI 3 "Standard" option, both when my videos are 8-bit UHD and when they are 10-bit?
Thanks for your suggestion.
Reply
#5
(2020-05-07, 12:22)FoxADRIANO Wrote:
(2020-05-07, 09:51)Hitcher Wrote: I assume you mean you've enabled the TV HDR mode. At a guess, as this is the first time you're using HDR, you'd need to calibrate the TV picture settings again.

Wow, you are right, I'm sorry for my inexperience. I don't want to activate HDR because my videos aren't HDR. So do you suggest using the HDMI 3 "Standard" option, both when my videos are 8-bit UHD and when they are 10-bit?
Thanks for your suggestion.      

If your video is NOT HDR, then your TV shouldn't be in an HDR mode - so your SDR settings should still apply?  The advice from Hitcher was for if your video WAS HDR.  Very few people use 10-bit video for SDR (as there isn't much/any commercial SDR 10-bit content - as SDR is usually 8-bit).  Therefore a lot of people assume 10-bit=HDR - when it doesn't mean that at all.  (I have a lot of 10-bit SDR HD HEVC/h.265 stuff that I mastered myself from 10-bit AVCi 100Mbs 4:2:2 masters)

If a video file doesn't have proper metadata in it - flagging SDR, HDR PQ, HDR HLG, etc. - then I suspect some player software may 'guess' and assume...

Can you post the video information from MediaInfo about the file you are playing?

It will look something like this:
Video
ID                                       : 1
Format                                   : HEVC
Format/Info                              : High Efficiency Video Coding
Format profile                           : Main 10@L5.1@High
HDR format                               : SMPTE ST 2086, HDR10 compatible
Codec ID                                 : hev1
Codec ID/Info                            : High Efficiency Video Coding
Bit rate                                 : 60.0 Mbps
Width                                    : 3 840 pixels
Height                                   : 2 160 pixels
Display aspect ratio                     : 16:9
Frame rate mode                          : Variable
Frame rate                               : 59.940 (60000/1001) fps
Minimum frame rate                       : 59.920 fps
Maximum frame rate                       : 59.960 fps
Color space                              : YUV
Chroma subsampling                       : 4:2:0
Bit depth                                : 10 bits
Bits/(Pixel*Frame)                       : 0.121
Color range                              : Limited
Color primaries                          : BT.2020
Transfer characteristics                 : PQ
Matrix coefficients                      : BT.2020 non-constant
Mastering display color primaries        : Display P3
Mastering display luminance              : min: 0.0040 cd/m2, max: 1100 cd/m2

Codec configuration box                  : hvcC


The key lines - "HDR format", "Color primaries : BT.2020" and "Transfer characteristics : PQ" - tell you this file is Rec 2020 gamut and PQ HDR10.

Whereas :

Video
ID                                       : 33 (0x21)
Menu ID                                  : 301 (0x12D)
Format                                   : HEVC
Format/Info                              : High Efficiency Video Coding
Format profile                           : Main 10@L5.1@Main
Codec ID                                 : 36
Width                                    : 3 840 pixels
Height                                   : 2 160 pixels
Display aspect ratio                     : 16:9
Frame rate                               : 59.940 (60000/1001) fps
Color space                              : YUV
Chroma subsampling                       : 4:2:0 (Type 2)
Bit depth                                : 10 bits
Color range                              : Limited
Color primaries                          : BT.2020
Transfer characteristics                 : HLG / BT.2020 (10-bit)
Matrix coefficients                      : BT.2020 non-constant


Whereas here the "Transfer characteristics : HLG / BT.2020 (10-bit)" line tells you it's Rec 2020 but with HLG HDR.
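
For anyone who wants to script that check, here is a small sketch that reads a MediaInfo text dump (like the two above) and labels it. It just matches the strings MediaInfo prints; it is not an official MediaInfo API:

# Sketch: classify a MediaInfo text dump as HDR10/PQ, HLG or SDR from the
# "Transfer characteristics" and "Color primaries" lines.  Pure string
# matching on MediaInfo's text output, nothing more.

def classify(mediainfo_text):
    fields = {}
    for line in mediainfo_text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()

    transfer = fields.get("Transfer characteristics", "").upper()
    primaries = fields.get("Color primaries", "")

    if "PQ" in transfer or "2084" in transfer:
        kind = "HDR10 / PQ (ST.2084)"
    elif "HLG" in transfer:
        kind = "HLG HDR"
    else:
        kind = "SDR"
    return f"{kind}, {primaries or 'BT.709 (assumed)'} primaries"

# Fed the first dump above it reports "HDR10 / PQ (ST.2084), BT.2020 primaries";
# fed the second, "HLG HDR, BT.2020 primaries".
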
Reply
#6
(2020-05-07, 12:19)noggin Wrote:
(2020-05-07, 08:37)FoxADRIANO Wrote:
(2019-10-17, 17:41)lightsout Wrote:  

Hi, I do video editing: I have always made 4K 8-bit videos and I watch them with a Shield TV and a Sony KD-65X8509C TV. With the Shield TV I have always used HDMI 3 set to "Standard". Today I created a 4K 10-bit file for the first time, so I chose the Advanced option on HDMI 3 (I don't remember the exact wording, but it is the highest quality, 4K 10-bit, etc.), and then the TV restarted. I also changed the settings on the Shield TV and chose "YUV420 10bit Rec.709".
As soon as I opened the Shield TV interface (the one with all the icons), and then Kodi to view that video, both showed very washed-out colours. I watch all my 4K videos with Kodi, but I was surprised that even the Kodi interface had changed colour. It is now a serious colour problem; the colours are missing. The video was also very, very faded: I had to increase the colour a lot. Before, the slider was at 54, and now I had to move it to 80. Why are the colours so faded after choosing the Advanced option on HDMI 3? Is something wrong? Do you get this problem too? Do I have to change settings on the Shield TV?

Thank you for helping me understand.
"Today I created a 4K 10-bit file for the first time"

Is that 10-bit HDR or 10-bit SDR?

Is it Rec 2020 or Rec 709 gamut?

If it is HDR - is it HLG (which the Shield TV doesn't handle natively) or HDR10/PQ (aka ST.2084), which the Shield TV can output natively?  (The Shield TV can output Rec 2020 HLG flagged as Rec 2020 SDR - and if your TV allows it you can then manually put it into HLG mode to get a decent HLG result.)  However, HLG displayed as SDR would not look desaturated or have lifted blacks (as it's SDR compatible for about 75% of its range).

If it's HDR and it is HDR10/PQ (aka ST.2084) you need to ensure that you have an HDR10 route all the way to your TV.  If not you will have lifted blacks (blacks that appear grey) and dull whites (they will be dim), and this will also desaturate your colours.  If you have Rec 2020 colour being displayed as Rec 709 then again you will have distorted colour, because the primaries are different AND the YCbCr to RGB matrix is different.

I believe your Sony TV is like mine - and lets you run in Auto HDR mode or manually force HDR10 or HLG display mode for sources that aren't correctly flagged (very useful for HLG content output flagged as SDR but left in HLG EOTF), or disable HDR entirely forcing all sources to be displayed in SDR (which will look terrible for HDR10/PQ/ST.2084 but may look reasonably OK for a lot of HLG due to HLG's backwards compatibility method), and also lets you select between Rec 709 and Rec 2020 colour gamut. This is a per-HDMI input selection.

What does MediaInfo say about the file you are playing in Kodi on your Shield TV?

Could you post the Text output from MediaInfo (it's a separate Windows/Mac/Linux application that tells you pretty much everything about a file)? 'Text' is one of the output formats MediaInfo offers.
My file isn't HDR and I don't want to use HDR with my Lumix GH5. As usual, I have created confusion. I wanted to run a test: I don't have any 10-bit files yet, so I loaded an 8-bit file into DaVinci Resolve and exported it as 10-bit. But MediaInfo correctly reads it as 8-bit.
Sorry for creating confusion.
I'm struggling and grasping at straws because I still can't watch my UHD files properly with the Shield TV and the Sony Bravia TV. I understand that I need to use HDMI 3 with the "Standard" option only, both for 8-bit files and for 10-bit files without HDR, because the "Advanced" option on HDMI 3 enables HDR and I don't want to use HDR with my Lumix GH5.
Anyway... if I remember correctly, my Sony TV has HDR and it is always active; it cannot be deactivated. I think I have an earlier model TV than yours.
Out of curiosity: what is a good bitrate for exporting a good, real 10-bit file from DaVinci Resolve?
I attach the MediaInfo file:
https://www.dropbox.com/s/xtfvsnfnh0h5jh...o.txt?dl=0
Reply
#7
(2020-05-07, 12:41)noggin Wrote: Very few people use 10-bit video for SDR (as there isn't much/any commercial SDR 10-bit content - as SDR is usually 8-bit).  Therefore a lot of people assume 10-bit=HDR - when it doesn't mean that at all.  (I have a lot of 10-bit SDR HD HEVC/h.265 stuff that I mastered myself from 10-bit AVCi 100Mbs 4:2:2 masters)

I'm not sure if my Sony TV KD-65X8509C has HDR or not. On the Internet I read everything and its opposite. Wink But I think it has HDR.
Anyway, it seems you are suggesting that I continue to shoot in 8-bit 50p even though my camera allows me to shoot 10-bit 25p. Wink
Reply
#8
(2020-05-07, 14:23)FoxADRIANO Wrote:
(2020-05-07, 12:41)noggin Wrote: Very few people use 10-bit video for SDR (as there isn't much/any commercial SDR 10-bit content - as SDR is usually 8-bit).  Therefore a lot of people assume 10-bit=HDR - when it doesn't mean that at all.  (I have a lot of 10-bit SDR HD HEVC/h.265 stuff that I mastered myself from 10-bit AVCi 100Mbs 4:2:2 masters)

I'm not sure if my Sony TV KD-65X8509C has HDR or not. On the Internet I read everything and its opposite. Wink But I think it has HDR.
Anyway, it seems you are suggesting that I continue to shoot in 8-bit 50p even though my camera allows me to shoot 10-bit 25p. Wink

It has, I got the same one Smile
LG 77G1 • Onkyo TX-RZ70 • Q Acoustics F:2050i C:2000Ci R:2020i A:QI65C • 2x BK Elec XXLS400-DF
Vero V (OSMC) • Nvidia Shield TV Pro (2019) • Sony UBP-X800M2 • Sony PlayStation 5
Reply
#9
(2020-05-07, 14:39)Theetjuh Wrote:
(2020-05-07, 14:23)FoxADRIANO Wrote:
(2020-05-07, 12:41)noggin Wrote: Very few people use 10-bit video for SDR (as there isn't much/any commercial SDR 10-bit content - as SDR is usually 8-bit).  Therefore a lot of people assume 10-bit=HDR - when it doesn't mean that at all.  (I have a lot of 10-bit SDR HD HEVC/h.265 stuff that I mastered myself from 10-bit AVCi 100Mbs 4:2:2 masters)

I'm not sure if my Sony TV KD-65X8509C has HDR or not. On the Internet I read everything and its opposite. Wink But I think it has HDR.
Anyway, it seems you are suggesting that I continue to shoot in 8-bit 50p even though my camera allows me to shoot 10-bit 25p. Wink

It has, I got the same one Smile   
Ohhh, OK, nice to hear. But how do you enable it? Is it possible to enable and disable it?
Reply
#10
(2020-05-07, 15:25)FoxADRIANO Wrote:
(2020-05-07, 14:39)Theetjuh Wrote:
(2020-05-07, 14:23)FoxADRIANO Wrote: I'm not sure if my Sony TV KD-65X8509C has HDR or not. On the Internet I read everything and its opposite. Wink But I think it has HDR.
Anyway, it seems you are suggesting that I continue to shoot in 8-bit 50p even though my camera allows me to shoot 10-bit 25p. Wink

It has, I got the same one Smile   
Ohhh, OK, nice to hear. But how do you enable it? Is it possible to enable and disable it?
It auto-engages; you just need to make sure you have your HDMI ports set to Enhanced.
LG 77G1 • Onkyo TX-RZ70 • Q Acoustics F:2050i C:2000Ci R:2020i A:QI65C • 2x BK Elec XXLS400-DF
Vero V (OSMC) • Nvidia Shield TV Pro (2019) • Sony UBP-X800M2 • Sony PlayStation 5
Reply
#11
(2020-05-07, 15:29)Theetjuh Wrote:
(2020-05-07, 15:25)FoxADRIANO Wrote:
(2020-05-07, 14:39)Theetjuh Wrote: It has, I got the same one Smile   
Ohhh, OK, nice to hear. But how do you enable it? Is it possible to enable and disable it?
It auto-engages; you just need to make sure you have your HDMI ports set to Enhanced.
So if I want to watch UHD videos, whether 8-bit or 10-bit and whether H.264 or H.265, I have to use the "Standard" HDMI option. Is that right?
Reply
#12
(2020-05-07, 14:23)FoxADRIANO Wrote:
(2020-05-07, 12:41)noggin Wrote: Very few people use 10-bit video for SDR (as there isn't much/any commercial SDR 10-bit content - as SDR is usually 8-bit).  Therefore a lot of people assume 10-bit=HDR - when it doesn't mean that at all.  (I have a lot of 10-bit SDR HD HEVC/h.265 stuff that I mastered myself from 10-bit AVCi 100Mbs 4:2:2 masters)

I'm not sure if my Sony TV KD-65X8509C has HDR or not. On the Internet I read everything and its opposite. Wink But I think it has HDR.
Anyway, it seems you are suggesting that I continue to shoot in 8-bit 50p even though my camera allows me to shoot 10-bit 25p. Wink

Not at all - what you shoot in and what you master to are two totally different things. You shoot 10-bit (and ideally Log) to give you latitude in your edit and grade; grading this down to 8-bit makes sense...
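
A tiny numerical sketch of that latitude point (nothing Resolve-specific, just counting code values): lift the shadows hard in the grade and see how many distinct levels survive in an 8-bit delivery, starting from an 8-bit capture versus a 10-bit capture.

# Sketch: why you grade from a 10-bit capture even when the delivery file
# is 8-bit.  Push a shadow ramp up two stops and count the distinct
# output codes that survive.

def distinct_codes_after_gain(source_bits, gain, out_bits=8):
    """Quantise a dark quarter-range ramp at source_bits, apply gain,
    deliver at out_bits, and return how many distinct levels remain."""
    src_max = 2 ** source_bits - 1
    out_max = 2 ** out_bits - 1
    codes = set()
    steps = 4096
    for i in range(steps):
        linear = i / (steps - 1) * 0.25               # a shadow ramp, 0..25% signal
        captured = round(linear * src_max)            # quantised by the camera
        graded = min(1.0, captured / src_max * gain)  # +2 stops in the grade
        codes.add(round(graded * out_max))            # quantised again for delivery
    return len(codes)

for bits in (8, 10):
    print(f"{bits}-bit capture, shadows x4 -> "
          f"{distinct_codes_after_gain(bits, 4)} distinct 8-bit output levels")
# The 8-bit capture is left with 65 coarse steps spread across the whole
# output range (visible banding); the 10-bit capture still fills all 256.
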
Reply
#13
(2020-05-07, 14:13)FoxADRIANO Wrote: My file isn't HDR and I don't want to use HDR with my Lumix GH5. As usual, I have created confusion. I wanted to run a test: I don't have any 10-bit files yet, so I loaded an 8-bit file into DaVinci Resolve and exported it as 10-bit. But MediaInfo correctly reads it as 8-bit.
Sorry for creating confusion.

I'm struggling and grasping at straws because I still can't watch my UHD files properly with the Shield TV and the Sony Bravia TV. I understand that I need to use HDMI 3 with the "Standard" option only, both for 8-bit files and for 10-bit files without HDR, because the "Advanced" option on HDMI 3 enables HDR and I don't want to use HDR with my Lumix GH5.

That should not be the case - enabling Advanced doesn't force your TV into HDR mode. I have had two generations of Sony HDR TVs, and neither has been forced into HDR mode when I have enabled Advanced mode.

Your reports also suggested the opposite - that HDR material was being displayed in SDR, not SDR in HDR...
Quote:Anyway... if I remember correctly, my Sony TV has HDR and it is always active; it cannot be deactivated. I think I have an earlier model TV than yours.
Out of curiosity: what is a good bitrate for exporting a good, real 10-bit file from DaVinci Resolve?
I attach the MediaInfo file:
https://www.dropbox.com/s/xtfvsnfnh0h5jh...o.txt?dl=0 
ID : 1
Format : HEVC
Format/Info : High Efficiency Video Coding
Format profile : Main@L5.1@Main
Codec ID : hvc1
Codec ID/Info : High Efficiency Video Coding
Duration : 27 min 45 s
Bit rate : 31.8 Mb/s
Width : 3 840 pixels
Height : 2 160 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 50.000 FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.077
Stream size : 6.17 GiB (99%)
Language : English
Color range : Limited
Color primaries : BT.709
Matrix coefficients : BT.709
Codec configuration box : hvcC


OK - so your file is 2160p50 8-bit HEVC Rec 709.
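
Incidentally, the "Bits/(Pixel*Frame)" figure in a MediaInfo dump is just the bitrate divided by the pixel throughput, so you can sanity-check an export target the same way. A quick sketch using the numbers from the dump above:

# Sketch: MediaInfo's "Bits/(Pixel*Frame)" is bitrate / (width * height * fps).
def bits_per_pixel_frame(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

print(round(bits_per_pixel_frame(31.8e6, 3840, 2160, 50.0), 3))   # 0.077, as in the dump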

Can you upload a clip?  I'll play it on my Shield TV Pro into my Sony HDR TV in both modes.

You didn't apply any LUTs in Resolve, did you?
Reply
#14
(2020-05-07, 18:37)FoxADRIANO Wrote:
(2020-05-07, 15:29)Theetjuh Wrote:
(2020-05-07, 15:25)FoxADRIANO Wrote: Ohhh, OK, nice to hear. But how do you enable it? Is it possible to enable and disable it?
It auto-engages; you just need to make sure you have your HDMI ports set to Enhanced.
So if I want to watch UHD videos, whether 8-bit or 10-bit and whether H.264 or H.265, I have to use the "Standard" HDMI option. Is that right?

No - if you enable Advanced mode it allows UHD HDR videos to play in HDR, and SDR to play in SDR.  It automatically switches based on the HDMI signal. (The Shield TV correctly switches its HDMI output between SDR and HDR - though you may have to enable this.)

10 bit doesn't mean HDR.
Reply
#15
>You shoot 10-bit (and ideally Log) to give you latitude in your edit and grade; grading this down to 8-bit makes sense...

You are always giving me excellent advice and your answers are perfect from a technical point of view. But perhaps too perfect for me. Wink Wink ... and sometimes I don't understand them well.
Could you explain further, please? Do you mean I can shoot 25p 10-bit and then change it to 50p 8-bit in post? But that way I don't get the frame rate I want for fluid video. But maybe I misunderstand you.

Anyway, I will use the "Advanced" option on HDMI 3 to watch all my videos.

Could you tell me what bitrate I should export at from Resolve to get a good H.265 video?

In Resolve I don't apply any LUT; I use Resolve only to export H.265 files.

I attach the requested file exported from Resolve (H.265): https://www.dropbox.com/s/3nflqbtuguqbdy...5.mov?dl=0
I also attach this clip; it is the original GH5 output (not edited): https://www.dropbox.com/s/fq2yw1ug97z5a2...5.MOV?dl=0
I'm not able to edit this clip in a good way. Could you help me, please? For me it is almost impossible to do a good grade.
That clip is important to me.

If I use V-Log, can I grade with the waveform only, like a normal clip? Right now I use the waveform to grade every clip.

One last question: if you were me and had to shoot my documentaries Smile Smile, would you shoot 25p 10-bit or 50p 8-bit? I know you have already answered this question many times and suggested I use V-Log and 25p 10-bit. But for the moment let's forget the idea of V-Log (because I don't know if I would be able to do a good grade with V-Log). These days I'm trying Cinelike D (it has good latitude, better than other GH5 profiles); it has faded colours like V-Log but a little less, and I am able to do a fair/good grade with it.
Reply
