FAQ: what hardware to get for 4K
#31
(2015-07-21, 23:08)DJ_Izumi Wrote: Without post processing, all decoding, software or GPU based, should all output the same image.

Yep - though post-processing steps like de-interlacing, 4:2:0 -> 4:2:2/4:4:4 chroma resampling, scaling, 16-235 level handling etc. are all things you should also think about, depending on your sources.

The core decoding should 'just happen' - it's what happens to the decoded video that has more impact on picture quality.
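
To make the 16-235 point concrete, here's a minimal sketch (my own illustration, not anything from Kodi) of expanding limited-range 8-bit levels to full range - get this step wrong, or do it twice, and you get grey blacks or crushed shadows:

```python
import numpy as np

def limited_to_full_range(y: np.ndarray) -> np.ndarray:
    """Expand 8-bit limited-range video levels (16-235) to full range (0-255).

    Values outside 16-235 (blacker-than-black / whiter-than-white
    excursions) are simply clipped here.
    """
    y = y.astype(np.float32)
    full = (y - 16.0) * (255.0 / (235.0 - 16.0))
    return np.clip(np.rint(full), 0, 255).astype(np.uint8)

# Reference black (16) and white (235) map to 0 and 255.
print(limited_to_full_range(np.array([16, 126, 235], dtype=np.uint8)))
# -> [  0 128 255]
```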
Reply
#32
(2015-07-21, 23:29)noggin Wrote:
Yep - though post-processing steps like de-interlacing, 4:2:0 -> 4:2:2/4:4:4 chroma resampling, scaling, 16-235 level handling etc. are all things you should also think about, depending on your sources.

The core decoding should 'just happen' - it's what happens to the decoded video that has more impact on picture quality.

Most of the movies I really like I try to keep as remuxes or Blu-ray rips - is post-processing needed on those? And where post-processing is needed, can it be done on the GPU, and if so, is that the better option over the CPU?
Reply
#33
(2015-07-21, 23:45)markus3000 Wrote:
Most of the movies I really like I try to keep as remuxes or Blu-ray rips - is post-processing needed on those? And where post-processing is needed, can it be done on the GPU, and if so, is that the better option over the CPU?

There will be 4:2:0 to 4:2:2 or 4:4:4 chroma upsampling - and that can go wrong whether you use a CPU or a GPU to do it. Assuming you're watching 1080p material at 1080p, there isn't much else to do.
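
To make the chroma part concrete, here's a rough sketch of the crudest possible 4:2:0 -> 4:4:4 upsampling (nearest-neighbour sample repetition - real players use proper interpolation filters, this just shows the shape of the job):

```python
import numpy as np

def upsample_420_to_444(cb: np.ndarray, cr: np.ndarray):
    """Nearest-neighbour upsample of 4:2:0 chroma planes to 4:4:4.

    In 4:2:0 each chroma sample covers a 2x2 block of luma pixels, so
    both planes are half resolution in each direction. Just repeating
    samples (as here) is where the 'can go wrong' comes in: a good
    scaler uses proper interpolation and respects chroma siting.
    """
    return (cb.repeat(2, axis=0).repeat(2, axis=1),
            cr.repeat(2, axis=0).repeat(2, axis=1))

# A 2x2 chroma plane (from a 4x4 luma picture) becomes 4x4.
cb = np.array([[100, 120], [140, 160]], dtype=np.uint8)
cb_full, _ = upsample_420_to_444(cb, cb)
print(cb_full.shape)  # (4, 4)
```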
Reply
#34
Here's a new article discussing 4K, HDR and the HDMI specs:

http://4k.com/news/how-hdmi-2-0-helped-4k-advance-8437/

So just as we're on the cusp of the next generation of SoCs/media boxes finally supporting HDMI 2.0, the spec is already behind the curve... (although maybe the enhancement to 2.0a is just a software revision rather than a new hardware capability?)
Reply
#35
Yeah, the upgrade from HDMI 2.0 to 2.0a should require a firmware change only, according to Anandtech:

anandtech Wrote:Fortunately, these static EDID extensions for HDR support can be added via firmware updates - no new hardware might be necessary for consumers with HDMI 2.0 equipment already in place.
Reply
#36
(2015-07-24, 11:42)oWarchild Wrote: Yeah, the upgrade from HDMI 2.0 to 2.0a should require a firmware change only, according to Anandtech:

anandtech Wrote:Fortunately, these static EDID extensions for HDR support can be added via firmware updates - no new hardware might be necessary for consumers with HDMI 2.0 equipment already in place.

Yes... but the firmware upgrade isn't going to suddenly make your display show HDR content with real HDR, is it? If your panel can't deliver HDR dynamic range then, even if you persuade it to accept an HDR signal, you'll still need some conversion to reduce the dynamic range to something the panel can actually display.

Of course, if you do have an HDR display it makes sense: the spec may not have been finalised before the set was manufactured, so a firmware update that lets it accept HDMI 2.0a sources is a sensible addition.

BTW - AIUI, one open issue with HDR Rec 2020 content is that the best way of down-converting HDR to regular 8- or 10-bit video for display on non-HDR screens is still being worked out. That is apparently non-trivial.
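
To give a feel for why it's non-trivial, here's a toy sketch of the idea using a simple Reinhard curve on linear light - nothing like a production HDR-to-SDR operator, just an illustration of squeezing a big luminance range into a small one:

```python
import numpy as np

def toy_tonemap(linear_nits: np.ndarray, sdr_peak: float = 100.0) -> np.ndarray:
    """Toy HDR-to-SDR tone map using a Reinhard-style curve.

    Compresses linear light (in nits) so it never exceeds sdr_peak.
    A real converter also has to undo the PQ/HLG transfer curve, map
    the Rec 2020 gamut down and honour the content metadata - which is
    why 'just scale it' doesn't give acceptable pictures.
    """
    l = linear_nits / sdr_peak
    return sdr_peak * l / (1.0 + l)

# A 1000-nit highlight lands under the 100-nit SDR ceiling while
# near-black values are barely touched.
print(toy_tonemap(np.array([0.1, 100.0, 1000.0])))  # ~[0.1, 50.0, 90.9]
```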
Reply
#37
(2015-07-24, 11:52)noggin Wrote: Yes... but the firmware upgrade isn't going to suddenly make your display show HDR content with real HDR, is it?
Oh, I didn't realise we were discussing displays :)

(2015-07-24, 11:52)noggin Wrote: If your panel can't deliver HDR dynamic range then, even if you persuade it to accept an HDR signal, you'll still need some conversion to reduce the dynamic range to something the panel can actually display.
If I understand correctly, HDR is sent as extra information over HDMI 2.0a. A normal display will just ignore any HDR data. In other words, the HDR metadata is already separate from the video itself, and only compatible TVs will read it and apply it.

cnet Wrote:In essence, the change is just specifications on how to transmit HDR metadata. That's information layered on top of the video image that tells the HDR-compatible display how to best take advantage of the greater color and contrast range in the underlying video image. So, for instance, a theoretical future 4K Blu-ray player can take that shadowy scene in a dark village and "tell" the display exactly how to render it in its HDR-enhanced glory.

source: http://www.cnet.com/uk/news/what-is-hdmi-2-0a/
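
For a rough picture of what that layered-on information amounts to, here's a sketch of HDR10-style static metadata (SMPTE ST 2086 mastering display volume plus content light levels) - the field names are my own illustration, not any real API:

```python
from dataclasses import dataclass

@dataclass
class HdrStaticMetadata:
    """Sketch of HDR10-style static metadata (illustrative field names).

    Roughly the SMPTE ST 2086 mastering display volume plus the two
    content light levels. An SDR display can ignore all of it; an HDR
    display uses it to map the picture onto its own capabilities.
    """
    red_primary: tuple      # CIE xy, e.g. (0.708, 0.292) = Rec 2020 red
    green_primary: tuple    # e.g. (0.170, 0.797)
    blue_primary: tuple     # e.g. (0.131, 0.046)
    white_point: tuple      # e.g. (0.3127, 0.3290) = D65
    max_mastering_nits: float   # peak of the mastering display, e.g. 1000.0
    min_mastering_nits: float   # black level, e.g. 0.0001
    max_cll: float   # brightest single pixel in the stream, nits
    max_fall: float  # highest frame-average light level, nits
```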
Reply
#38
(2015-07-24, 12:09)oWarchild Wrote:
If I understand correctly, HDR is sent as extra information over HDMI 2.0a. A normal display will just ignore any HDR data. In other words, the HDR metadata is already separate from the video itself, and only compatible TVs will read it and apply it.

Yep - HDMI 2.0 (not 'a') already supports Rec 2020 carriage: 12-bit 4:4:4 RGB for 2160/24-30p and 12-bit 4:2:2/4:2:0 YCbCr for 2160/24-60p. I guess the additional metadata added in HDMI 2.0a helps with the display of this?

I guess you'd still need a Rec 2020-compatible display to show this content natively in Rec 2020, even without HDR; and if you don't have Rec 2020 capability, the source will have to do a conversion to ITU 709 or ITU 601.
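
For the curious, the core of that Rec 2020 -> 709 conversion (working on linear RGB, and ignoring the transfer-curve and HDR questions above) is a single 3x3 matrix; this sketch uses the matrix published in ITU-R BT.2087, with crude clipping for out-of-gamut colours:

```python
import numpy as np

# Linear-light Rec 2020 RGB -> Rec 709 RGB (matrix from ITU-R BT.2087).
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def rec2020_to_rec709_linear(rgb: np.ndarray) -> np.ndarray:
    """Convert linear Rec 2020 RGB to linear Rec 709 RGB.

    Saturated Rec 2020 colours fall outside the 709 gamut and come out
    negative (or above 1.0); clipping them, as here, is the crudest
    possible gamut mapping - part of why good conversion is hard.
    """
    return np.clip(rgb @ BT2020_TO_BT709.T, 0.0, 1.0)

# Pure Rec 2020 red is out of gamut for 709: all three channels clip.
print(rec2020_to_rec709_linear(np.array([1.0, 0.0, 0.0])))  # [1. 0. 0.]
```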
Reply
#39
I had to Google ITU 709 and ITU 601 :blush: - and yes, I agree.
Reply
#40
(2015-07-24, 12:28)oWarchild Wrote: I had to Google ITU 709 and ITU 601 :blush: - and yes, I agree.

And joy of joys: not only does Rec 2020 have a new RGB-to-YCbCr relationship, to add to the existing ones we have for 601 and 709 (which themselves differ from each other), it also appears to have different gammas for 10 and 12 bit, meaning conversion between 10 and 12 bit isn't as simple as just ditching the two LSBs and dithering. (10 bit has a similar gamma to 709, but 12 bit doesn't?)

And Rec 2020 also has different RGB primaries.
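
To put numbers on that, here are the three sets of luma weights side by side - decode with the wrong matrix and the colours shift:

```python
# Luma weights (Kr, Kg, Kb) per standard; note Kg = 1 - Kr - Kb.
LUMA_COEFFS = {
    "Rec 601":  (0.299,  0.587,  0.114),
    "Rec 709":  (0.2126, 0.7152, 0.0722),
    "Rec 2020": (0.2627, 0.6780, 0.0593),
}

def luma(r: float, g: float, b: float, standard: str) -> float:
    """Y' from non-linear R'G'B' using the chosen standard's weights."""
    kr, kg, kb = LUMA_COEFFS[standard]
    return kr * r + kg * g + kb * b

# The same pure green gives a different luma under each matrix -
# decode with the wrong one and both brightness and colour shift.
for std in LUMA_COEFFS:
    print(std, round(luma(0.0, 1.0, 0.0, std), 4))
# Rec 601 0.587, Rec 709 0.7152, Rec 2020 0.678
```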
Reply
#41
A lot of TVs (e.g. the Sony range) already support Deep Color, which was added in the HDMI 1.3 spec. I know these TVs won't be able to read the extended HDR HDMI info, but would they benefit at all from content encoded with HDR in mind?
Reply
#42
(2015-07-24, 14:08)JustAnotherUser Wrote: A lot of TVs (e.g. the Sony range) already support Deep Color, which was added in the HDMI 1.3 spec. I know these TVs won't be able to read the extended HDR HDMI info, but would they benefit at all from content encoded with HDR in mind?

I was never clear what Sony's support for Deep Color actually meant...
Reply
#43
The naming is a bit odd as, according to Wikipedia, it was Sony who proposed xvYCC; I guess 'Deep Color' was the more consumer-friendly name.

"allows more than 8 bits per pixel for RGB 4:4:4 and (I guess) YCbCr 4:4:4"

is what I found - not quite the same thing as HDR, I guess?
Reply
#44
@JustAnotherUser, from what I've found, Deep Color essentially means 10-bit (or higher) colour depths. So your TV should support 10-bit colour natively, but I don't think it would get any benefit from HDR, which is luminosity-related.
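
As a footnote on the bit-depth side, going from 8 to 10 bits is usually done by bit replication rather than a plain shift, so that full-scale white stays full scale - a minimal sketch:

```python
def expand_8bit_to_10bit(v: int) -> int:
    """Expand an 8-bit code value to 10 bits by bit replication.

    A plain shift (v << 2) maps 255 to 1020 rather than 1023; copying
    the two MSBs into the new LSBs keeps black and white at the rails.
    """
    return (v << 2) | (v >> 6)

print(expand_8bit_to_10bit(0), expand_8bit_to_10bit(255))  # 0 1023
```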
Reply
#45
(2015-07-24, 15:17)oWarchild Wrote: @JustAnotherUser, from what I've found, Deep Color essentially means 10-bit (or higher) colour depths. So your TV should support 10-bit colour natively, but I don't think it would get any benefit from HDR, which is luminosity-related.

And do Sony Deep Color-compatible TVs actually have panels with greater-than-8-bit capability?
Reply
