What is the CHEAPEST option for 4k hevc playback?
#16
I hear the Apple TV 4K gets MaxFALL/MaxCLL HDR metadata wrong too: https://firecore.com/comment/90894#comment-90894
What about Zidoo X9S/Z9S?
Is there a media player that is perfect for HDR metadata?
#17
(2019-01-04, 09:18)djnice Wrote: I hear the Apple TV 4K gets MaxFALL/MaxCLL HDR metadata wrong too: https://firecore.com/comment/90894#comment-90894

Is there a media player that is perfect for HDR metadata?

Likely not. We are really at the birth of 4K HDR media players and standards. Mainline Linux, for example, is only just beginning to introduce 10-bit HDR support, and Dolby is still having fun ironing out bugs with Dolby Vision. Even popular brands like Sony and LG have had issues.

Quote:What about Zidoo X9S/Z9S?


Zidoo = No > see THIS thread (click)

All the 4K HDR AMLogic LibreELEC / CoreELEC / OSMC media players also do not pass the MaxFALL/MaxCLL-specific HDR metadata. The NVIDIA Shield is the same, I believe, unless it has recently been fixed.

There are even HDR Blu-ray titles that carry no MaxFALL/MaxCLL HDR metadata.
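For context on what those two values actually are: MaxCLL is the brightest single pixel in the whole programme and MaxFALL the highest frame-average brightness, per CTA-861.3. A minimal sketch (hypothetical helper, assuming decoded frames as NumPy arrays of linear-light nit values per RGB channel):

```python
import numpy as np

def max_cll_fall(frames):
    """Compute MaxCLL / MaxFALL from decoded frames.

    frames: iterable of (H, W, 3) arrays of per-channel light
    levels in nits (cd/m2). Per CTA-861.3:
      - MaxCLL  = brightest pixel (max over frames of per-pixel max RGB)
      - MaxFALL = highest frame average of the per-pixel max RGB
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        per_pixel_max = frame.max(axis=2)   # max of R, G, B per pixel
        max_cll = max(max_cll, float(per_pixel_max.max()))
        max_fall = max(max_fall, float(per_pixel_max.mean()))
    return max_cll, max_fall
```

This is why some titles simply ship without the values: they have to be computed over every frame of the master, and if the mastering house skips that pass the stream is flagged 0/0.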

The trouble with getting all technical and ticking every HDR box is that you can be as accurate as you want (e.g. the NVIDIA Shield converting everything perfectly from Rec.709 to BT.2020), but once that data gets to the HDR display you are at the whim of the TV manufacturer, as NVIDIA found out with a bunch of SDR / 4K HDR TV users.

You then have LG OLED TVs, for example, currently having issues with near-black flashing, and other LG workarounds have also had to be implemented for chroma subsampling (see the Vero 4K+'s most recent Dec. update).



You really have to look visually for colour banding, too dark a picture, washed-out colours, etc., because not every bit of technical metadata is measured 100% correctly downstream, even with HDFury Vertex hardware, for example.

It looks like Apple has gone down the conservative metadata route after looking at user feedback; I have personally seen very few complaints.
The same goes for OSMC (Vero 4K) and AMLogic HDR hardware running recent community versions of LE or CE.

NVIDIA is finally changing to colorspace switching on the Shield, like everyone else has been doing, so no more washed-out Rec.709 colours for affected 4K HDR TV owners in 2019.

Projector owners belong in a special category and are more demanding, because their equipment is not bright enough to display the full brightness levels present in source HDR content, so tone mapping is then required. It seems they definitely do need MaxFALL/MaxCLL HDR metadata.

#18
(2019-01-04, 10:47)wrxtasy Wrote: The trouble with getting all technical and ticking every HDR box is that you can be as accurate as you want (e.g. the NVIDIA Shield converting everything perfectly from Rec.709 to BT.2020), but once that data gets to the HDR display you are at the whim of the TV manufacturer, as NVIDIA found out with a bunch of SDR / 4K HDR TV users.

You then have LG OLED TVs, for example, currently having issues with near-black flashing, and other LG workarounds have also had to be implemented for chroma subsampling (see the Vero 4K+'s most recent Dec. update).

You really have to look visually for colour banding, too dark a picture, washed-out colours, etc., because not every bit of technical metadata is measured 100% correctly downstream, even with HDFury Vertex hardware, for example.

It looks like Apple has gone down the conservative metadata route after looking at user feedback; I have personally seen very few complaints.
The same goes for OSMC (Vero 4K) and AMLogic HDR hardware running recent community versions of LE or CE.

NVIDIA is finally changing to colorspace switching on the Shield, like everyone else has been doing, so no more washed-out Rec.709 colours for affected 4K HDR TV owners in 2019.

Projector owners belong in a special category and are more demanding, because their equipment is not bright enough to display the full brightness levels present in source HDR content, so tone mapping is then required. It seems they definitely do need MaxFALL/MaxCLL HDR metadata.
  
None of this is any reason for not passing HDR metadata, is it? After all, every UHD Blu-ray player does it when you play a UHD Blu-ray, doesn't it? Passing the wrong metadata isn't that helpful, is it?

The HDR metadata is there for tone mapping in all displays that can't reproduce the source picture's full PQ range (which is almost all consumer displays). The effect will be subtle on restrained HDR content that keeps most of the picture in the SDR portion (<100 nits) of the PQ curve and only pushes a few speculars into the HDR range, but content mastered to push HDR harder and put more of the picture into the HDR range (>100 nits) will be more noticeable. That said, I still don't think PQ is the right standard for domestic viewing, and it's clear manufacturers agree, as they are introducing systems that alter PQ light levels based on ambient light levels, moving away from a prescriptive absolute bit-to-light relationship. It's a pity HLG wasn't picked up: ambient light level control is built into that standard, as it was designed for real-world TV viewing...
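To put a number on that "SDR portion" claim: the PQ curve (SMPTE ST 2084) is an absolute mapping from signal value to nits, and a quick sketch of its inverse EOTF shows that everything below 100 nits occupies roughly half of the signal range:

```python
def pq_inverse_eotf(nits):
    """SMPTE ST 2084 inverse EOTF: absolute luminance in cd/m2
    (0..10000) -> normalized PQ signal value in [0, 1].
    Constants are the ones defined in the ST 2084 standard."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2
```

Evaluating it, 100 nits lands at roughly 0.51 of full signal, so the <100-nit "SDR" region really does take about half the code values, with the entire 100-to-10000-nit HDR range squeezed into the other half.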

https://www.lightillusion.com/uhdtv.html Is really good at explaining HDR systems and the role of metadata.

All DV content being played back on a DV device (Apple TV, Amazon Fire TV Stick 4K, Chromecast Ultra etc.) will surely be passing dynamic HDR metadata (you wouldn't get DV certification otherwise)?

Is the Apple TV not passing through the correct static HDR10 metadata in Netflix and Amazon Prime?  (Looking at MrMC forums there seem to be quite a lot of issues with HDR replay?)

Also what aspect of Metadata is the HD Fury Vertex getting wrong? (The areas I know it has issues are clock rate - and that's not a function of metadata?) The 4:2:2 bit-depth it correctly reports - there is only 12-bit 4:2:2 supported at 2160p in the HDMI standard. What the signal contains beyond this (padded 8-bit or padded 10-bit) is a content issue, not a signal issue?
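The padded-content point above can be made concrete: the only way to tell padded 8- or 10-bit content from true 12-bit inside the fixed 12-bit 4:2:2 HDMI container is to inspect the LSBs of every sample, which is exactly the continuous processing that is beyond a device like the Vertex. A toy sketch of the idea (hypothetical helper, samples as plain integers):

```python
def effective_bit_depth(samples_12bit):
    """Guess the content bit depth inside a 12-bit container.

    HDMI 2.0 carries 2160p 4:2:2 as 12-bit only; 8- or 10-bit
    content is zero-padded in the LSBs. If the low 4 (or 2) bits
    of every sample are zero, the content is likely 8 (or 10) bit.
    """
    if all(s & 0xF == 0 for s in samples_12bit):
        return 8
    if all(s & 0x3 == 0 for s in samples_12bit):
        return 10
    return 12
```

Note this is only a heuristic: dithered 10-bit content could occasionally have zero LSBs by chance, which is another reason reporting the signal format rather than the content depth is the defensible behaviour.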

*** EDIT - just checked the Shield TV. In Netflix and Amazon, with HDR content, both the mastering display min/max luminance and the max and frame-average light levels are output on a show-by-show basis (and are different). In Kodi on the Shield TV, if this data is present it appears to be correctly passed through, on a quick check comparing MediaInfo and HDFury Vertex reported values. I have some files where MediaInfo confirms that the min/max luminance of the mastering display is flagged but the max and frame-average light levels aren't; in this case the mastering display metadata is passed through (I have files flagged with 1000, 4000 and 10000 nit peak mastering levels, and these all pass through) but the max/average light level is left at 0/0, as reported by my HDFury Vertex ***
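The MediaInfo side of that comparison is easy to script. A hypothetical sketch that pulls the static HDR10 fields out of MediaInfo's text output, reporting missing MaxCLL/MaxFALL as 0 to mirror what players send over HDMI in that case:

```python
import re

def parse_hdr_metadata(mediainfo_text):
    """Extract static HDR10 metadata from MediaInfo text output.

    Returns mastering display luminance as the raw string, and
    MaxCLL / MaxFALL as integers in nits (0 when not flagged,
    matching the 0/0 that players output for absent values).
    """
    def grab(label):
        m = re.search(label + r"\s*:\s*([^\n]+)", mediainfo_text)
        return m.group(1).strip() if m else None

    cll = grab("Maximum Content Light Level")
    fall = grab("Maximum Frame-Average Light Level")
    return {
        "mastering_luminance": grab("Mastering display luminance"),
        "max_cll": int(cll.split()[0]) if cll else 0,
        "max_fall": int(fall.split()[0]) if fall else 0,
    }
```

Run over a folder of clips, this gives the expected values to compare against what an HDFury Vertex reports on the wire.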
#19
(2019-01-04, 11:37)noggin Wrote: None of this is any reason for not passing HDR metadata is it? After all every UHD Blu-ray player does it when you play a UHD Blu-ray doesn't it?  Passing the wrong metadata isn't that helpful is it?

I agree you need the full monty of HDR metadata.
But it does seem some media player vendors (not UHD Blu-ray) are not passing the complete source metadata to displays.
How important are MaxFALL and MaxCLL then, with current 4K HDR TV real-world media streamer usage scenarios?

Practically, there has not been a howling majority of 4K HDR TV users baying for blood because it's missing. Maybe near enough is good enough?
Or maybe the sample size of users is too low this early in the HDR media player usage cycle.

Projector owners seem to be the most vocal minority.

Quote:Also what aspect of Metadata is the HD Fury Vertex getting wrong? (The areas I know it has issues are clock rate - and that's not a function of metadata?) The 4:2:2 bit-depth it correctly reports - there is only 12-bit 4:2:2 supported at 2160p in the HDMI standard. What the signal contains beyond this (padded 8-bit or padded 10-bit) is a content issue, not a signal issue?

As detailed by @wesk05 in the Shield thread.

EDIT:
Is the Shield 7.2.x test firmware now passing through MaxFALL and MaxCLL? I know @wesk05 was previously asking for it.

#20
(2019-01-04, 12:30)wrxtasy Wrote:
(2019-01-04, 11:37)noggin Wrote: None of this is any reason for not passing HDR metadata is it? After all every UHD Blu-ray player does it when you play a UHD Blu-ray doesn't it?  Passing the wrong metadata isn't that helpful is it?

I agree you need the full monty of HDR Metadata.
But it does seem some media player vendors (not UHD Bluray) are Not passing the complete source Metadata to displays.    
Which players do we know are not passing this stuff through in Netflix, Amazon Prime etc.?

Or is it just Open Source players that have the issues?
Quote:How important are MaxFALL and MaxCLL then, with current 4K HDR TV real-world media streamer usage scenarios?
AIUI it can be quite important when your screen's performance doesn't match that of the mastering screen, as it gives additional information that is useful in handling playback in a way that minimises the differences. (See the article I posted, which details the impact it can have, particularly on displays with Auto Brightness Limiting, such as OLEDs.)
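To illustrate why the metadata helps here: a toy tone-mapping sketch (not any vendor's actual algorithm) where knowing the content's MaxCLL lets a display compress only the range the content actually uses, instead of assuming the full 10,000-nit PQ range:

```python
def tone_map(nits, display_peak, max_cll):
    """Toy highlight roll-off: pass through below a knee, then
    compress [knee, max_cll] linearly into [knee, display_peak].

    Without MaxCLL, max_cll must be assumed to be 10000 nits,
    so highlights get compressed far more than necessary."""
    knee = display_peak * 0.75          # start of compression (arbitrary)
    if nits <= knee or max_cll <= display_peak:
        return min(nits, display_peak)
    t = (nits - knee) / (max_cll - knee)
    return knee + t * (display_peak - knee)
```

On a 700-nit panel, a 1000-nit-mastered title keeps far more of its highlight detail when `max_cll` is known than when the display has to budget for a hypothetical 10,000-nit pixel, which is why dimmer displays (and especially projectors) care the most.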
Quote:Practically there has not been a howling majority of 4K HDR TV users baying for blood because it's missing. Maybe near enough is good enough ?
If their TVs displayed it - like AVRs display Dolby TrueHD, DTS-HD Master Audio etc. - there would be more howling, I'm sure... Wink
Quote:Or maybe the sample size of users is too low this early into the HDR media player usage cycle.

I also wonder how many people have their TVs correctly set-up for HDR and SDR.
Quote:Projector owners seem to be the most vocal minority.

Probably because it makes a bigger difference to their picture performance, and projector owners may be more across picture quality issues, as they are likely to be at the higher end of Home Cinema expertise rather than 'if it plays and the colours look roughly OK, I'm happy'.
Quote:
Quote:Also what aspect of Metadata is the HD Fury Vertex getting wrong? (The areas I know it has issues are clock rate - and that's not a function of metadata?) The 4:2:2 bit-depth it correctly reports - there is only 12-bit 4:2:2 supported at 2160p in the HDMI standard. What the signal contains beyond this (padded 8-bit or padded 10-bit) is a content issue, not a signal issue?

As detailed by @wesk05 in the Shield thread.    

Which aspect of the metadata? The stuff in that thread I can see doesn't seem to be metadata related - or have I missed something? Non-metadata stuff like clock rate and content bit-depth, rather than HDMI signal bit-depth, I can understand. Processing 12-bit video continuously to detect 8-bit and 10-bit padding LSBs is kind of beyond the remit of the device, and it correctly reports the HDMI format as expected. (Which is always 12-bit for a compliant HDMI 2.0 2160p signal - no other bit-depth is supported as an HDMI signal.)
Quote:EDIT:
Is the Shield 7.2.x test firmware now passing through MaxFALL and MaxCLL? I know @wesk05 was previously asking for it.
  
My current Shield firmware has passed min/max mastering display and MaxCLL/MaxFALL details for files that contain them, as tested with an HDFury Vertex. When I get a chance I'll try more clips. But a clip I had with
Code:
Mastering display luminance              : min: 0.0005 cd/m2, max: 1000 cd/m2
Maximum Content Light Level              : 1000 cd/m2
Maximum Frame-Average Light Level        : 400 cd/m2
correctly sent those values as reported by my HD Fury Vertex.

Another clip with these metadata values :
Code:
Mastering display luminance              : min: 0.0005 cd/m2, max: 4000 cd/m2
(and with no MaxCLL/MaxFALL) was reported with the correct mastering values but 0/0 for MaxCLL/MaxFALL.

*** More than happy to test other clips if people want to DropBox short clips ***
#21
How bad are the S912 devices with complex skins? I have an S905W device that plays HEVC files no problem, but it's slow when dealing with heavy Kodi skins. Would an S912 box like the Tanix TX9 Pro be a reasonable upgrade for skin performance?
#22
(2019-01-06, 06:47)daught Wrote: How bad are the S912 devices with complex skins? I have an S905W device that plays HEVC files no problem, but it's slow when dealing with heavy Kodi skins.
Would an S912 box like the Tanix TX9 Pro be a reasonable upgrade for skin performance?

A bit, but in reality you need a faster CPU / GPU rendering package, like a modern Intel / AMD box or an NVIDIA Shield, for real snappiness with demanding, complex Kodi skins. Using eMMC storage on AMLogic devices for LE / CE Kodi also helps:
https://forum.libreelec.tv/thread/9425-h...-for-data/

You can also ramp up the GPU MHz on LE / CE AML devices by logging in over SSH and running:

Code:
echo "echo 2 > /sys/class/mpgpu/scale_mode" >> /storage/.config/autostart.sh
...then reboot.

#23
I got a Tanix TX9 S912 and copied the same Kodi setup over. The improvement is pretty big; it's running Aura with widgets pretty well. Would LibreELEC be a notable improvement over Android?
 