Skylake: "only ... decode 8-bit HEVC" --> Kaby Lake: "full HEVC/10-bit decode"
#1
Thought some here might find this information useful/interesting:

"For example, in a recent slide at its 2014 developer conference discussing the challenges of 4K content, the company indicated that some content will require that the hardware can decode 10-bit HEVC content. In the same slide, Intel indicated Skylake would only be able to decode 8-bit HEVC content but that its 2016 platform would bring full HEVC/10-bit decode."

http://www.fool.com/investing/general/20...th-it.aspx
Reply
#2
This might be the reason why HDMI 1.4 and not 2.0 is used for Braswell barebones.
8-bit HEVC is better than nothing, but to get HDR and to make use of larger colour gamuts, 10-bit is required, IMO.
Reply
#3
On the other hand, the Tegra X1 can decode the Main 10 profile.
Shame, though: once 10-bit HDR content is out there on Blu-ray, the Shield probably won't be able to play it.
Reply
#4
Most broadcast content is shot with 10-bit accuracy these days, even ignoring HDR. 10-bit HEVC has benefits just for avoiding banding: with YCbCr, saturated blue has very low levels of detail under 8-bit quantisation and is much improved with 10-bit.
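You can see the banding point without any video at all. A minimal Python sketch (just counting distinct code values, nothing decoder-specific): quantise the same smooth, dim gradient to 8-bit and 10-bit codes and count the steps available.

```python
# Minimal sketch of why 8-bit quantisation bands on smooth gradients.
# We sample a smooth ramp covering only 10% of full range (e.g. a dark
# sky gradient) and count how many distinct code values each bit depth
# can represent across it. Standard library only.
N = 10000
ramp = [i / (N - 1) * 0.1 for i in range(N)]  # smooth 0.0 -> 0.1 gradient

levels_8 = len({round(v * 255) for v in ramp})    # distinct 8-bit codes
levels_10 = len({round(v * 1023) for v in ramp})  # distinct 10-bit codes

print(levels_8, levels_10)  # 10-bit gives roughly 4x more steps here
```

Only a couple of dozen steps are available in 8-bit across that range, which is exactly where visible banding comes from; 10-bit gives about four times as many.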
Reply
#5
Regarding Braswell: according to hardware.fr, it can decode 10-bit HEVC up to 4K at 30 Hz, but not 60 Hz:

hardware.fr Wrote:Note that the Intel GPU's HEVC decoder handles both the Main and Main10 (10-bit) profiles and is currently limited to Level 5 (100 Mbit/s, 4K at 30 Hz for the Main10 profile). The hardware acceleration does not go any higher, even though the 4K Blu-ray standard can theoretically reach 60 Hz for some content (as a reminder, the 4K video outputs are also limited to 30 Hz on Braswell boards and do not support HDMI 2.0 / HDCP 2.2).

Source: http://www.hardware.fr/articles/936-5/gp...4-265.html (translated from the French)
Reply
#6
So if I wanted to replace my desktop PC with one that supports 10-bit HEVC at 60 Hz, I'd need to wait until sometime next year and get a Kaby Lake processor? Although I don't NEED a new PC at this time, I do want one, and I've used the same desktop for 6-7 years now.
Reply
#7
nobody considering AMD?
Reply
#8
(2015-07-11, 00:24)cstmstyle Wrote: So If I wanted to replace my desktop PC to support 10-bit HEVC 60hz I need to wait until sometime next year and get the kaby lake processor? Although I don't NEED a new PC at this time I do want one and have used the same desktop for 6-7 years now.
I would be keeping my eye on the MINIX guys for cheap 10-bit HEVC, if they end up releasing a media player based on the new Amlogic S905/S912 SoCs with these specs:

http://www.cnx-software.com/2015/03/27/a...s905-s912/

Reply
#9
(2015-07-11, 05:42)chilman408 Wrote: nobody considering AMD?

No, because despite being decent hardware, AMD GPUs are ruined by having the worst software drivers on the planet.
Reply
#10
(2015-07-11, 00:24)cstmstyle Wrote: So If I wanted to replace my desktop PC to support 10-bit HEVC 60hz I need to wait until sometime next year and get the kaby lake processor? Although I don't NEED a new PC at this time I do want one and have used the same desktop for 6-7 years now.

Not necessarily; this thread is about GPU hardware support. A powerful CPU can software-decode 10-bit HEVC. In January, Intel released a driver update that gives Broadwell Core processors hybrid (?) support for 10-bit HEVC. Not sure if that covers 10-bit HEVC at 4K@60 Hz, though.

EDIT: you would also need HDMI 2.0 / DisplayPort 1.2 in order to drive a TV / monitor at 4K@60Hz.
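A rough back-of-the-envelope check shows why HDMI 1.4 can't carry 4K@60 while HDMI 2.0 can. This sketch assumes the standard CTA-861 594 MHz pixel clock for 3840x2160@60 and the usual TMDS 8b/10b encoding overhead; the figures are approximations, not spec quotes.

```python
# Rough 4K@60 link-bandwidth sanity check.
# Assumptions: 594 MHz pixel clock (CTA-861 timing for 3840x2160@60,
# including blanking), 8 bits per RGB component, and payload rate =
# pixel clock x 3 channels x 8 effective bits (TMDS 8b/10b coding).
pixel_clock_hz = 594e6
bits_per_pixel = 3 * 8  # 8-bit RGB / 4:4:4

needed_gbps = pixel_clock_hz * bits_per_pixel / 1e9        # ~14.26 Gbit/s

hdmi14_gbps = 340e6 * 3 * 8 / 1e9  # 340 MHz TMDS clock -> ~8.16 Gbit/s payload
hdmi20_gbps = 600e6 * 3 * 8 / 1e9  # 600 MHz TMDS clock -> ~14.4 Gbit/s payload

print(f"needed {needed_gbps:.2f} Gbit/s, "
      f"HDMI 1.4 {hdmi14_gbps:.2f}, HDMI 2.0 {hdmi20_gbps:.2f}")
```

4K@60 at 8-bit only just fits inside HDMI 2.0's payload rate; 10-bit RGB/4:4:4 at 4K@60 would need roughly 17.8 Gbit/s, which is why real HDMI 2.0 hardware falls back to 4:2:0 chroma for 10-bit 4K@60.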
Reply
#11
(2015-07-11, 12:20)oWarchild Wrote:
(2015-07-11, 00:24)cstmstyle Wrote: So If I wanted to replace my desktop PC to support 10-bit HEVC 60hz I need to wait until sometime next year and get the kaby lake processor? Although I don't NEED a new PC at this time I do want one and have used the same desktop for 6-7 years now.

Not necessarily, this thread is about GPU Hardware support. Powerful CPUs can software decode 10-bit HEVC. In January Intel released a driver update that gives Broadwell core processors hybrid (?) support for 10-bit HEVC. Not sure if this covers 10-bit HEVC 4K@60hz, though.

EDIT: you would also need HDMI 2.0 / DisplayPort 1.2 in order to drive a TV / monitor at 4K@60Hz.

Yup, already looked into that. Nothing really in my price range for a video card that would drive a 10-bit 4K TV/monitor at 60 Hz with 4:4:4 chroma. I may have been confused by this thread into thinking I need a CPU capable of decoding 10-bit HEVC at 4K/60 Hz. It seems like that would be a benefit to have, but maybe a GPU would make it unnecessary.
Reply
#12
I've just stumbled upon the Intel Graphics for Linux - Hardware Specification - Programmer's Reference Manuals (PRM), and in Braswell's Volume 10: High Efficiency Video Coding (HEVC) I've found the following:

Quote:Supports NV12 video buffer plane:
• Supports 4:2:0, 8-bit per pixel component (Y, Cb and Cr) video.

Quote:HEVC Decoder Features
• Supports full-featured HEVC Main Profile standard, up to Level 6.2.

No mention of 10-bit or the Main10 profile. So it does seem like Braswell won't support 10-bit hardware decoding (on Linux, at least).
Reply
#13
10-bit seems to be important for 4K Blu-ray... is that true?

http://forum.blu-ray.com/showpost.php?s=...ostcount=3
Reply
#14
(2015-07-14, 23:13)Roby77 Wrote: 10bit seems to be important for blueray 4k ..it's true ?

Yes indeed, 4K Blu-ray will support 10-bit: redsharknews - Massive upgrade to Blu Ray: 4K, HDR, 10 bit, Rec 2020 colour space.

10-bit will also be used in 4K TV broadcasts (Rec. 2020) and Netflix.
Reply
#15
Didn't want to start a new thread since I saw this, so my question is:

For a 4K-capable 10-bit HEVC/H.265 HTPC, would I be good to go with the spec shown below?

Intel Celeron G1840
mITX board with a full PCIex 16 slot
Nvidia GTX 950
4GB DDR3 1600 RAM
OS : Windows 8.1 or 10 64 bit
Reply
