nVidia Shield TV (2015 & 2017 Models) - UPDATED: May 25, 2018
(2019-01-02, 17:23)Hitcher Wrote:
(2019-01-02, 16:06)FoxADRIANO Wrote: Here is what you asked for. The first two files are the originals, and I see them perfectly well on my PC monitor.
https://www.dropbox.com/s/dbygzia2rypbrz...b.mp4?dl=0
https://www.dropbox.com/s/xa630d26spg5tp...K.mp4?dl=0
The tests refer to the 4K file. With the 1080p one it is even worse.
If I watch those files on my PC monitor they look perfect, but on the Sony TV the colours are very bad.
This is what I see on my Sony TV:
https://www.dropbox.com/s/bxulfvaw7s8b2m...V.mp4?dl=0
https://www.dropbox.com/s/au5rhgdvjgi6xw...V.mp4?dl=0
https://www.dropbox.com/s/cj2xlb69lic905...V.mp4?dl=0
Now I'm using Standard HDMI.
It's definitely your TV, as the colours are perfect on mine. Just a thought, but have you calibrated the colours for the different source types?

Yes - I was about to ask the same thing.  I assume you've disabled all the terrible picture processing that is enabled by default and are in a sensible picture mode?

Do you want to PM me your Picture settings? Posting them here may derail the thread.
Reply
OK
Reply
(2019-01-02, 04:08)wesk05 Wrote:
(2019-01-01, 05:28)hdmkv Wrote: Yes, it does. I get an HDR signal even with 8-bit, but when it's indicating it's getting 10- or 12-bit, PQ has more pop (or maybe it's placebo). I know @wesk05 has mentioned that several displays get the bit depth wrong (even when actually rendering 10-bit), so who knows. I'm thinking of investing in an HD Fury Vertex.
With 4:2:2, if you don't get banding, you can safely ignore the 8-bit being reported by AVRs, projectors and HDFury devices (HDFury reports "upto 12-bits"). None of these devices report the actual bit depth in use with the 4:2:2 pixel packing format.
No banding. Tested w/the usual suspect... 'The Martian' UHD. Thought HDFury reported accurately on color space & bit depth; long thread on AVSForum about how great a device it is.
Reply
(2019-01-03, 04:02)hdmkv Wrote: No banding. Tested w/the usual suspect... 'The Martian' UHD. Thought HDFury reported accurately on color space & bit depth; long thread on AVSForum about how great a device it is. 
It's a great device for $300. However, the FPGA isn't capable of detecting the actual bit depth of 4:2:2 pixel packing. It also doesn't have 1Hz accuracy when measuring the pixel clock; its accuracy is in kHz, so the calculated refresh rate is not absolutely correct. I have alluded to this in one of my posts in the Zidoo thread.
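
To put rough numbers on that (a hypothetical Python sketch using the standard CTA-861 2160p24 timing; the figures are mine, not HDFury's): the refresh rate is derived as pixel clock divided by total pixels per frame, so a clock quantized to kHz gives a calculated rate that is close to, but not exactly, the true one.

    # Hypothetical back-of-the-envelope sketch, not HDFury's actual method.
    H_TOTAL, V_TOTAL = 5500, 2250           # CTA-861 total timing for 2160p24 (VIC 93)
    PIXELS_PER_FRAME = H_TOTAL * V_TOTAL    # 12,375,000 pixels per frame

    true_refresh = 24000 / 1001             # 23.976... Hz
    true_pclk_hz = PIXELS_PER_FRAME * true_refresh        # ~296,703,296.7 Hz

    measured_pclk_hz = round(true_pclk_hz / 1000) * 1000  # clock quantized to kHz
    calculated_refresh = measured_pclk_hz / PIXELS_PER_FRAME

    print(f"true refresh:       {true_refresh:.6f} Hz")        # 23.976024
    print(f"calculated refresh: {calculated_refresh:.6f} Hz")  # 23.976000
    print(f"error: {abs(true_refresh - calculated_refresh) * 1e6:.1f} microhertz")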
Reply
(2019-01-03, 04:02)hdmkv Wrote:
(2019-01-02, 04:08)wesk05 Wrote:
(2019-01-01, 05:28)hdmkv Wrote: Yes, it does. I get an HDR signal even with 8-bit, but when it's indicating it's getting 10- or 12-bit, PQ has more pop (or maybe it's placebo). I know @wesk05 has mentioned that several displays get the bit depth wrong (even when actually rendering 10-bit), so who knows. I'm thinking of investing in an HD Fury Vertex.
With 4:2:2, if you don't get banding, you can safely ignore the 8-bit being reported by AVRs, projectors and HDFury devices (HDFury reports "upto 12-bits"). None of these devices report the actual bit depth in use with the 4:2:2 pixel packing format.
No banding. Tested w/the usual suspect... 'The Martian' UHD. Thought HDFury reported accurately on color space & bit depth; long thread on AVSForum about how great a device it is.   

The HD Fury reports the HDMI format being sent by the source, but not the contents of that video.  

2160p 4:2:2 signals are always carried as 12-bit over HDMI - that is the only bit depth the HDMI format supports for 4:2:2 at 2160p. The HDMI standard doesn't allow a signal to be sent as 8-bit, 10-bit or 16-bit 4:2:2 at 2160p, only 12-bit. It's a quirk of the HDMI standard. Other formats, like RGB/4:4:4 (at 2160p/23.976-30p) and 4:2:0 (at 2160p/50-60p), support multiple bit depths; 4:2:2 only supports 12-bit. Conversely, 4:2:2 12-bit is the only non-8-bit format (i.e. one that can properly carry HDR) that exists for both <30Hz and >30Hz 2160p formats.

https://www.hdmi.org/manufacturer/hdmi_2...q.aspx#146 

If a device wants to send 2160p 4:2:2 at a lower bit depth, it still sends an HDMI signal in 4:2:2 12-bit format: 8-bit video is padded with 4 bits of zeroes as the LSBs, and 10-bit video is padded with 2 bits of zeroes as the LSBs. The video signal carried is always 12-bit; it just may not be carrying video that is 12-bit. For an HDMI analysis device to detect this it would have to analyse the actual video content. The HD Fury doesn't really do that - it tells you what the source is flagging the video as, not what the content is.
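
A minimal sketch of that padding, assuming nothing beyond what's described above (illustrative Python, not code from the HDMI spec):

    def pack_to_12bit(sample: int, source_depth: int) -> int:
        """Pad an 8-, 10- or 12-bit video sample into the mandatory 12-bit
        4:2:2 container by left-shifting and zero-filling the vacated LSBs."""
        assert source_depth in (8, 10, 12)
        return sample << (12 - source_depth)

    # 8-bit and 10-bit reference white land on the same 12-bit wire value:
    print(f"{pack_to_12bit(235, 8):012b}")   # 111010110000 (4 zeroed LSBs)
    print(f"{pack_to_12bit(940, 10):012b}")  # 111010110000 (2 zeroed LSBs)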

Displays that report 8-bit or 10-bit 4:2:2 at 2160p are presumably checking a frame or two to see if there is any content in the LSBs of the signal, since the signal they are receiving, if it is HDMI compliant, will always be a 12-bit signal.
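
And a sketch of the kind of LSB check such a display might run - the heuristic here is pure guesswork on my part, not any vendor's actual firmware:

    def infer_source_depth(samples_12bit) -> int:
        """Guess the original bit depth of video carried in a 12-bit 4:2:2
        container by checking whether the low-order bits ever carry anything."""
        lsb_or = 0
        for s in samples_12bit:
            lsb_or |= s
        if (lsb_or & 0b1111) == 0:
            return 8     # bottom 4 bits always zero -> padded 8-bit source
        if (lsb_or & 0b0011) == 0:
            return 10    # bottom 2 bits always zero -> padded 10-bit source
        return 12        # low bits active -> genuinely 12-bit content

    frame = [v << 2 for v in (64, 512, 940)]  # 10-bit samples, padded with 2 zero LSBs
    print(infer_source_depth(frame))          # -> 10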
Reply
OK - I have the 7.2.2 Engineering test image after signing up for it a day or so ago.

I have my Shield TV configured for 2160p50 4:2:2 12-bit Rec 2020 output in display settings, and have enabled the colorimetry switching option in Developer options.

The Home screen and Android UI are now output as 2160p50 4:2:2 12-bit Rec 709, as is the Rec 709 video from SDR HD apps like iPlayer, SVT Play etc. 

UHD HDR10 content from Amazon Prime and Netflix is output as 2160p50 4:2:2 12-bit Rec 2020 with an ST.2084 EOTF and HDR metadata flagged. When you return to the home screen the UI switches back to Rec 709 SDR.

UHD SDR content (The Crown, 1983) from Netflix is output as 2160p50 4:2:2 12-bit Rec 709 - which I'm not sure is correct (Is Netflix SDR content Rec 709 SDR or Rec 2020 SDR or is it player specific?) 
(I'll check the Apple TV 4K to see what it does with these UHD SDR titles as a comparison - though with OTT providers you can't tell for sure what streams are being played - though the ATV Dev HUD is very useful)

*** EDIT - have checked the Apple TV 4K. It outputs Netflix UHD SDR content like 1983 and The Crown as Rec 709 UHD too, so the Shield TV is doing the same thing. ***

In Kodi Leia 18.0-Beta4 (which is what I currently have installed), Rec 709 SDR content is output flagged as Rec 709, Rec 2020 HDR10 content is output as Rec 2020 with an ST.2084 EOTF, and Rec 2020 HLG content is output as Rec 2020 SDR (as expected, since the Shield TV doesn't flag HLG over HDMI AFAIK). Frame rate switching happens as before.

This looks - superficially - like a good solution for Kodi.

@wesk05 are you able to check whether the 709 output is as expected?

(Now all we need is for OTT app authors to properly frame rate switch in Android...)
Reply
(2019-01-03, 12:15)noggin Wrote: @wesk05 are you able to check whether the 709 output is as expected?
I will be able to do that only by the end of this month, when I finally move into my new house. I had to push back the move-in a couple of times due to unexpected construction delays.
Reply
(2019-01-03, 17:58)wesk05 Wrote:
(2019-01-03, 12:15)noggin Wrote: @wesk05 are you able to check whether the 709 output is as expected?
I will be able to do that only by the end of this month, when I finally move into my new house. I had to push back the move-in a couple of times due to unexpected construction delays.
 Good luck - hope all goes well.
Reply
This chroma clipping bug is solved then?
https://forums.geforce.com/default/topic...2/#5837042
Reply
(2019-01-04, 15:56)djnice Wrote: This chroma clipping bug is solved then?
https://forums.geforce.com/default/topic...2/#5837042
 Can you point me to a source of that test image?
Reply
Already commented about it here: https://forum.kodi.tv/showthread.php?tid=339023

but I hope the Kodi devs put Lanczos3, Spline36 and Yadif back into Kodi 18 on Android. You can enable those in SPMC if you disable hardware acceleration (MediaCodec, or whatever it's called); it will then look much better than with hardware acceleration turned on on the NV Shield TV, since we all know how bad the upscaling quality is on the Shield for all content below 4K, even 1080p.

And Lanczos3 will usually look better than about 95% of TVs' native upscaling out there.
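
For reference, here's the standard Lanczos-3 kernel those scalers are built on (the textbook windowed-sinc definition, nothing SPMC-specific) - a sinc windowed by a wider sinc, non-zero over six taps:

    import math

    def lanczos3(x: float) -> float:
        """Lanczos kernel with a=3: L(x) = 3*sin(pi*x)*sin(pi*x/3) / (pi*x)^2
        for 0 < |x| < 3, L(0) = 1, and 0 elsewhere."""
        if x == 0.0:
            return 1.0
        if abs(x) >= 3.0:
            return 0.0
        px = math.pi * x
        return 3.0 * math.sin(px) * math.sin(px / 3.0) / (px * px)

    # Tap weights for a sample halfway between two source pixels:
    print([round(lanczos3(o - 0.5), 4) for o in range(-2, 4)])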
Reply
Here is the test pattern: https://forums.geforce.com/default/topic...luation/1/
Reply
(2019-01-05, 09:55)djnice Wrote: Here is the test pattern: https://forums.geforce.com/default/topic...luation/1/
A quick check with that image suggests it's still clipping. This is with output at 1080p23.976 Rec 709 YUV 12-bit.

Very quick check with PLUGE suggests <16 is being clipped on my display.

(Should also add that MrMC seems to do the same on the Apple TV 4K - though that outputs 2160p23.976 4:4:4 BT.709 8-bit as it doesn't have whitelisting)

And I stress that these are quick tests.  

When I get a chance I'll check with better test signal files.
Reply
I'm sure I read that the Shield has fixed the high-bitrate Atmos dropouts; will this be implemented in Kodi?
Reply
(2019-01-05, 13:58)noggin Wrote: A quick check with that image suggests it's still clipping. This is with output at 1080p23.976 Rec 709 YUV 12-bit.

Very quick check with PLUGE suggests <16 is being clipped on my display.
It used to pass through super whites/blacks with YCbCr output, but that was changed in one of the updates (I believe the one that introduced refresh rate switching).
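
For anyone wanting to test this themselves, a hypothetical sketch of the check (my own illustration, not an existing tool): in 8-bit terms, "super blacks" are codes below 16 and "super whites" codes above 235, so if the chain clamps to the nominal 16-235 range those codes simply vanish.

    def range_report(luma_8bit):
        """Report whether 8-bit codes outside the nominal 16-235 video range
        survived the chain or were clipped away."""
        sub_blacks   = sum(1 for y in luma_8bit if y < 16)
        super_whites = sum(1 for y in luma_8bit if y > 235)
        if sub_blacks == 0 and super_whites == 0:
            return "no codes <16 or >235: out-of-range values were clipped"
        return f"{sub_blacks} sub-black / {super_whites} super-white codes survived"

    # A PLUGE-style probe: if the output clamps at 16, the 4 and 8 disappear.
    print(range_report([4, 8, 16, 128, 235, 240, 250]))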
Reply