nVidia Shield TV (2015 & 2017 Models) - UPDATED: May 25, 2018
(2019-01-12, 00:26)noggin Wrote:
(2019-01-11, 14:44)SparkyBoy Wrote: I have a Sony KD-65A1 4K TV and am now pleased that the colour switching is much better for BT.709 and BT.2020 content. 

However I am struggling with Netflix HDR content appearing much darker than 4K-only content. 

For example, my daughter watches "Chilling Adventures of Sabrina". My experience so far is:

1) on the Shield TV, the Netflix content is 4K HDR and the scenes are generally very dark
2) on Sky Q, the Netflix app shows the exact same content in 4K but without HDR. The same scenes are much brighter. 

So we are having to watch any Netflix HDR shows on the Sky Q app as non-HDR content due to the general darkness.

For reference, I use KODI for movies and the HDR content is nice and bright and looks amazing. So I know the issue is not with the TV or with KODI. It seems to be the Netflix HDR content. 

Does anyone else experience this or have suggestions on what I can change on Shield TV?
TL;DR - we are watching SDR content brighter than Hollywood wants us to...

In general most people with HDR TVs are watching their SDR content far brighter than SDR is intended to be watched and are thus pushing SDR content into the HDR range of their displays.  This is understandable - people are making their new TVs 'nice and bright' for day-to-day viewing.

However when you watch HDR10/DV content, the SDR portion of the picture within the HDR signal is shown at the level it is 'designed' to be watched at, leaving only speculars and highlights in the HDR brightness range.  This means SDR content within HDR looks a lot darker than the way many people actually watch SDR-only material...  HDR10/DV are PQ HDR standards that map precise light levels to pixel values, unlike SDR standards.

In theory SDR content should peak at ~100 nits, and HDR10 content shows the SDR elements of a picture at that light level.  However lots of people set their SDR viewing brightness/contrast/backlight etc. so that SDR-only content peaks at 200-300+ nits - well into the HDR range...

(This is why lots of us think HDR10/DV is flawed as a domestic HDR standard and HLG makes more sense)


**EDIT - thought of a better way of explaining this**

Imagine SDR video is designed to go from 0-100% with 100% being the brightest bit of the picture, and 0 being black.

We adjust our TVs so that 100% is at a brightness we like when we set brightness/contrast/black level/backlight (and in some cases contrast extension/HDR simulation etc.)

However the specifications used during production set 100% SDR video at 100 nits, and that's what well-adjusted displays used for post-production of SDR content are calibrated to (along with the right bias/ambient light level in the production environment).  

Most people have their TVs set with 100% SDR at much higher levels than this - 200, 300, 400+ nits.  This gives you an incredibly bright SDR picture.  It is, however, much brighter than the specs used in production.
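To put very rough numbers on that, here is a quick Python sketch - the simple BT.1886-style gamma 2.4 curve and the peak levels are just illustrative assumptions, not how any particular TV actually behaves:

# Rough SDR display light output, assuming a simple BT.1886-style
# power law (gamma 2.4) with black at 0 nits - a simplification,
# just to show the scale of the gap.
def sdr_nits(signal, peak_nits, gamma=2.4):
    # 'signal' is the SDR video level as 0.0-1.0 (i.e. 0-100%)
    return peak_nits * (signal ** gamma)

# The same 75% SDR signal on a 100-nit reference display vs a TV
# set 'nice and bright' so SDR peaks at 300 nits:
print(round(sdr_nits(0.75, 100)))   # ~50 nits  (what the grader saw)
print(round(sdr_nits(0.75, 300)))   # ~150 nits (bright living-room setting)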

Now imagine HDR video is designed to go from 0-1000%.  The SDR picture content still occupies 0-100%, but unlike SDR, the brighter bits of the picture aren't clipped at 100%: the 100-1000% range is available for (indeed, ideally is reserved for) highlights and specular detail that would either be clipped at 100% in SDR or have to be distorted to remain visible.  

Well-graded HDR video will keep the bulk of the picture information in the SDR 0-100% range, so this section maps pretty well (at least up to around 75%) to the 0-100% SDR version.  

However with HDR10/DV content, TVs follow the PQ (Perceptual Quantisation) curve, which absolutely links video levels to light levels - it nails pixel values to specific brightnesses.  This means the 0-100% SDR range in HDR video (which carries the bulk of the picture) is fixed at 0-100 nits by the TV, with the 100-1000% range mapped above this (ideally up to 1000 nits - though not all displays reach that high, and some PQ gradings go to 4000 or even 10000 nits).
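The PQ curve itself is published (SMPTE ST 2084), so you can see exactly how code values are nailed to light levels. Below is a minimal Python sketch of the ST 2084 EOTF; display tone mapping above a panel's actual peak isn't modelled, and the two example code values are only approximate:

# SMPTE ST 2084 (PQ) EOTF: maps a normalised 0.0-1.0 PQ code value
# to an absolute luminance in nits, independent of the display.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code):
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(round(pq_to_nits(0.5081)))   # ~100 nits - roughly where SDR 'diffuse white' sits
print(round(pq_to_nits(0.7518)))   # ~1000 nits - a common HDR10 grading peak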

If you are watching SDR content mapped to 0-300 nits, and then watch the same show in HDR where the SDR portion (i.e. most of the image) is mapped to 0-100 nits, the show will look a lot dimmer.
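Putting rough numbers on it (using the same illustrative assumptions as the sketches above): with SDR settings that put 100% video at ~300 nits, a 75% signal lands at around 150 nits on screen, while in HDR10 the same mostly-SDR-range material sits at roughly 50 nits under the PQ curve - so the bulk of the HDR picture can easily end up around three times dimmer than the SDR version.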
 
 Appreciate the explanation. 

However, when I watch HDR movies via KODI the brightness is fine: dark areas are dark and bright areas really bright - overall the scenes look good.

When I watch a Netflix show in HDR on the Shield TV, the picture is generally much darker overall, so it's difficult to make out details.