(2019-01-04, 12:30)wrxtasy Wrote: (2019-01-04, 11:37)noggin Wrote: None of this is any reason for not passing HDR metadata is it? After all every UHD Blu-ray player does it when you play a UHD Blu-ray doesn't it? Passing the wrong metadata isn't that helpful is it?
I agree you need the full monty of HDR metadata.
But it does seem some media player vendors (not UHD Blu-ray players) are not passing the complete source metadata to displays.
Which players do we know are not passing this stuff through in Netflix, Amazon Prime etc.?
Or is it just Open Source players that have the issues?
Quote:How important are MaxFLL and MaxCLL then with current 4K HDR TV real-world media streamer usage scenarios?
AIUI it can be quite important when your screen's performance doesn't match that of the mastering screen, as it gives you additional information that is useful in handling playback in a way that minimises the differences. (See the article I posted, which details the impact it can have on displays with Automatic Brightness Limiting, particularly OLEDs.)
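To illustrate why that extra information matters, here is a minimal sketch (not any vendor's actual algorithm; the function name and logic are purely illustrative) of how a display with a lower peak than the mastering monitor might use MaxCLL to avoid unnecessarily compressing highlights:

```python
# Illustrative sketch only: how a display *might* use MaxCLL to pick a
# tone-mapping source peak, rather than blindly assuming the content
# reaches the mastering display's full peak luminance.

def tone_map_peak(display_peak, mastering_peak, max_cll=None):
    """Return the source peak (cd/m2) to tone-map from, or None if no
    tone mapping is needed. MaxCLL of 0/None is treated as 'unknown',
    falling back to the mastering display peak."""
    source_peak = max_cll if max_cll else mastering_peak
    if source_peak <= display_peak:
        return None  # content fits within the display's range as-is
    return source_peak  # compress highlights from here down to display_peak

# A 700-nit display with a 4000-nit mastered clip whose content never
# actually exceeds 600 nits (MaxCLL = 600):
print(tone_map_peak(700, 4000, max_cll=600))  # -> None (no tone mapping needed)
# Without MaxCLL, the display must assume 4000 nits and dim everything:
print(tone_map_peak(700, 4000))  # -> 4000
```

The point being: with MaxCLL missing (or zeroed), the display has to assume worst case and may tone-map more aggressively than the content requires.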
Quote:Practically there has not been a howling majority of 4K HDR TV users baying for blood because it's missing. Maybe near enough is good enough ?
If their TVs displayed it, like AVRs display Dolby TrueHD, DTS-HD Master Audio etc., there would be more howling, I'm sure...
Quote:Or maybe the sample size of users is too low this early into the HDR media player usage cycle.
I also wonder how many people have their TVs correctly set up for HDR and SDR.
Quote:Projector owners seem to be the most vocal minority.
Probably because it makes a bigger difference to their picture performance, and projector owners may be more aware of picture quality issues; they are likely to be at the higher end of Home Cinema enthusiasts, rather than the 'if it plays and the colours look roughly OK, I'm happy' crowd.
Quote:Quote:Also what aspect of Metadata is the HD Fury Vertex getting wrong? (The areas I know it has issues are clock rate - and that's not a function of metadata?) The 4:2:2 bit-depth it correctly reports - there is only 12-bit 4:2:2 supported at 2160p in the HDMI standard. What the signal contains beyond this (padded 8-bit or padded 10-bit) is a content issue, not a signal issue?
As detailed by @wesk05 in the Shield thread.
Which aspect of the metadata? The stuff in that thread that I can see doesn't seem to be metadata-related, or have I missed something? Non-metadata issues like clock rate and content bit-depth, rather than HDMI signal bit-depth, I can understand. Continuously processing 12-bit video to detect 8-bit and 10-bit padding LSBs is somewhat beyond the remit of the device, and it correctly reports the HDMI format as expected (which is always 12-bit for a compliant HDMI 2.0 2160p 4:2:2 signal; no other bit-depth is supported on the HDMI link).
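For anyone curious what that LSB analysis would actually involve, here is a minimal sketch (the function name is hypothetical, and this assumes raw 12-bit integer sample values are available, which is exactly the hard part for a passive HDMI device):

```python
# Hypothetical sketch: infer the effective source bit depth of content
# carried in a 12-bit container by checking how many low-order bits are
# never set. 8-bit content padded to 12 bits leaves the 4 LSBs at zero;
# 10-bit content leaves the 2 LSBs at zero.

def effective_bit_depth(samples, container_bits=12):
    """Return the apparent source bit depth of padded samples."""
    used = 0  # OR of all samples: a set bit means that bit position is used
    for s in samples:
        used |= s
    if used == 0:
        return 0  # degenerate case: all-zero samples tell us nothing
    trailing_zeros = (used & -used).bit_length() - 1  # lowest set bit position
    return container_bits - trailing_zeros

# Example: 8-bit values left-shifted by 4 into a 12-bit container
padded = [v << 4 for v in (16, 64, 235)]
print(effective_bit_depth(padded))  # -> 8
```

Note it has to scan enough samples to be confident, since a single frame of 10-bit content could coincidentally have its two LSBs at zero; doing this continuously on a 2160p stream is non-trivial, which is the point above.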
Quote:EDIT:
Is the Shield 7.2.x test firmware now passing through MaxFLL and MaxCLL? I know @wesk05 was previously asking for it.
My current Shield firmware has passed min/max mastering display luminance and MaxCLL/MaxFLL details for files that contain them, as tested with an HD Fury Vertex. When I get a chance I'll try more clips. One clip I had with
Code:
Mastering display luminance : min: 0.0005 cd/m2, max: 1000 cd/m2
Maximum Content Light Level : 1000 cd/m2
Maximum Frame-Average Light Level : 400 cd/m2
correctly sent those values as reported by my HD Fury Vertex.
Another clip with these metadata values :
Code:
Mastering display luminance : min: 0.0005 cd/m2, max: 4000 cd/m2
(and with no Max CLL/FLL) was reported with the correct mastering values but 0/0 for Max CLL/FLL.
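For reference, here is a minimal sketch of how those cd/m² values end up encoded as integers for transport, based on my understanding of the CTA-861-G Dynamic Range and Mastering (DRM) InfoFrame units (the function and field names are illustrative, not from any real API):

```python
# Illustrative sketch of HDR10 static metadata encoding for the HDMI
# DRM InfoFrame, assuming CTA-861-G units: min mastering luminance in
# units of 0.0001 cd/m2; max mastering luminance, MaxCLL and MaxFALL
# in units of 1 cd/m2. Field names are hypothetical.

def encode_drm_values(min_lum_cdm2, max_lum_cdm2, max_cll, max_fall):
    """Convert cd/m2 values to the integer units used on the wire."""
    return {
        "min_display_mastering_luminance": round(min_lum_cdm2 * 10000),
        "max_display_mastering_luminance": round(max_lum_cdm2),
        "max_content_light_level": round(max_cll),
        "max_frame_average_light_level": round(max_fall),
    }

# The first clip's values: min 0.0005, max 1000, MaxCLL 1000, MaxFLL 400
print(encode_drm_values(0.0005, 1000, 1000, 400))
```

Note that, as I understand CTA-861-G, a MaxCLL/MaxFALL value of 0 on the wire means "unknown", so a clip genuinely lacking those fields being reported as 0/0 (as above) is arguably correct behaviour rather than a fault.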
*** More than happy to test other clips if people want to DropBox short clips ***