noggin (Posting Freak) | Posts: 6,743 | Joined: Oct 2008 | Reputation: 317
2015-01-17, 12:49 (last modified: 2015-01-17, 12:50 by noggin)
Good spot!
Looking at that it seems to suggest the following :
5th Generation (Broadwell) Intel Core processors with HD Graphics 5500, HD Graphics 6000, or Iris Graphics 6100 can do HEVC 10-bit and 8-bit
4th Generation (Haswell) Intel Core processors with HD Graphics 5000/4600/4400, Iris Graphics 5100, or Iris Pro Graphics 5200 can do HEVC 8-bit
To me that reads as if Broadwell and Haswell Core CPUs with the right GPU get HEVC at either both 8 and 10 bit or just 8 bit, but Pentium, Celeron, and Core processors without the right GPU don't even get HEVC 8-bit? (Though some get VP9.)
So if I read that correctly, the Haswell NUCs with Core i5-4250U (HD Graphics 5000) and Core i3-4010U (HD Graphics 4400) may get 8-bit hardware-accelerated HEVC decode (but not 10-bit)? However, to me it also reads as if the Haswell Celeron 2955U won't get HEVC hardware decoding at any bit depth?
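For anyone wanting to sanity-check a particular NUC against that list, it reads as a simple lookup. A minimal sketch only; the structure and names are mine, transcribed from the driver notes quoted above, not any Intel or Kodi API:

```python
# HEVC decode support per Intel GPU, as read from the driver release
# notes quoted above. Values are the bit depths with hardware-assisted
# (hybrid) decode; GPUs not listed get no HEVC assist at all.
HEVC_SUPPORT = {
    # 5th Generation (Broadwell)
    "HD Graphics 5500": {8, 10},
    "HD Graphics 6000": {8, 10},
    "Iris Graphics 6100": {8, 10},
    # 4th Generation (Haswell)
    "HD Graphics 4400": {8},
    "HD Graphics 4600": {8},
    "HD Graphics 5000": {8},
    "Iris Graphics 5100": {8},
    "Iris Pro Graphics 5200": {8},
}

def supports_hevc(gpu: str, bit_depth: int) -> bool:
    """True if the driver notes list HEVC assist at this depth for this GPU."""
    return bit_depth in HEVC_SUPPORT.get(gpu, set())
```

So `supports_hevc("HD Graphics 5000", 8)` comes back true for the i5-4250U NUC, while anything 10-bit, or a Celeron-class GPU that isn't in the table, comes back false.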
(AIUI they are using shaders to accelerate some of the HEVC processing rather than having it fully implemented in a separate unit as H264, MPEG2 etc.?)
Also that's a Windows driver - I don't know how quickly that stuff migrates to Linux... (But interesting to try. I have some 2160/59.94p and 2160/50p files around from the BBC UHD DVB-T2 broadcasts in London last summer. Can't remember if they were 8 or 10 bit though...)
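On the "can't remember if they were 8 or 10 bit" point: ffprobe reports the stream's pixel format, and the format name implies the bit depth. A quick sketch, assuming ffmpeg/ffprobe's standard pix_fmt naming (the command in the comment is one way to get the value):

```python
# `ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt \
#    -of csv=p=0 recording.ts`
# prints the pixel format; the name's suffix gives the bit depth.
PIX_FMT_DEPTH = {
    "yuv420p": 8,       # 8-bit 4:2:0 (HEVC Main)
    "yuv420p10le": 10,  # 10-bit 4:2:0 (HEVC Main 10)
    "yuv422p10le": 10,  # 10-bit 4:2:2
}

def bit_depth(pix_fmt: str) -> int:
    """Map an ffmpeg pix_fmt name to its per-component bit depth."""
    return PIX_FMT_DEPTH[pix_fmt]
```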
jjd-uk (Team-Kodi Member) | Posts: 10,519 | Joined: Oct 2011 | Reputation: 638
I would suspect this is hybrid decoding.
Hybrid decoding is where decoding is offloaded from the CPU to the GPU and runs as a mixture of software and hardware decode on the GPU. My understanding is that, since HEVC builds on H.264 and shares many of the same features, the GPU can use hardware decode where the required decode process is the same as in H.264; however, a significant amount of software decode still has to run on the GPU.
I'm not sure if Kodi will be able to take advantage of this as it may require using the Quicksync interface to the GPU.
Full HEVC hardware decode is supposed to be coming with the next generation Skylake architecture.
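That three-way split (full hardware, hybrid, pure software) can be sketched as the decision a player would make from whatever the driver advertises. Purely illustrative; the names are mine, not Kodi's or Intel's API:

```python
def decode_path(codec: str, driver_profiles: set[str]) -> str:
    """Pick 'full-hw', 'hybrid', or 'software' from advertised profiles."""
    if codec in driver_profiles:
        return "full-hw"            # dedicated fixed-function block (H.264, MPEG-2)
    if codec + "-hybrid" in driver_profiles:
        return "hybrid"             # fixed-function + shader/EU assist (HEVC here)
    return "software"               # everything falls back to the CPU

# A Haswell GPU as described in this thread:
haswell = {"mpeg2", "h264", "hevc-8bit-hybrid"}
```

With that set, H.264 goes full hardware, 8-bit HEVC takes the hybrid path, and anything else (10-bit HEVC, VP9 on most parts) drops to software.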
noggin
2015-01-17, 16:10 (last modified: 2015-01-17, 16:12 by noggin)
@jjd-uk Suspect Windows versions of Kodi are more likely, in the first instance, to get HEVC via this route than Linux/OSX? My understanding is that it is hybrid decoding too - the H264-similar functionality done in the VPU, with shaders/EUs doing the H265-specific stuff? I presume the lower-end GPUs don't have enough EUs to cope (or they want to keep differentiating the Core series?)
I've just started playing on an i5 Haswell NUC with MPC-HC under Win 7 64-bit and it appears to work with Big Buck Bunny 720p and 1080p HEVC clips (i.e. I get a H/W flag when playing them with the new drivers if I enable HEVC in the LAV Video decoder's hardware acceleration section and select DXVA2 native).
However my DVB-T2 .ts recordings of HEVC 2160/50p and 2160/60p off-air don't play (though ffmpeg doesn't want to remux them to mp4 or mkv, so they may be a bit fishy). I get a grey screen playing them in MPC-HC with HEVC hardware decoding, and lots of frame drops if I disable hardware decoding. Early days though.
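For the remux failures, one thing worth trying is regenerating timestamps during the remux, since off-air .ts captures often have broken PTS/DTS. A sketch only, assuming ffmpeg is on the PATH; it just builds the command line, run it with subprocess or paste it into a shell:

```python
def remux_cmd(src: str, dst: str) -> list:
    """Build an ffmpeg command that rewraps a .ts capture without re-encoding.

    -fflags +genpts regenerates missing presentation timestamps, which
    often helps with off-air broadcast captures.
    """
    return ["ffmpeg", "-fflags", "+genpts", "-i", src,
            "-map", "0:v", "-map", "0:a?",   # keep video, and audio if present
            "-c", "copy", dst]               # stream copy: no re-encode

# usage: subprocess.run(remux_cmd("recording.ts", "recording.mkv"), check=True)
```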
Am downloading BBB 4K now to see how that plays.
Posts: 284 | Joined: Mar 2011 | Reputation: 1
Has hybrid decoding been implemented in Windows Kodi? Better than nothing for us Broadwell NUC users.