A serious examination of 1080p hi10p hardware requirements
#76
(2016-03-13, 03:45)Matt Devo Wrote:
(2016-03-13, 03:26)wrxtasy Wrote: Looks like the recent ffmpeg optimisations have benefitted all platforms with decent CPUs for software HEVC and Hi10P decoding.

Added this info to the Pick the Right Kodi Box.

What 1080p HEVC bitrates are we talking about here?

got a link to some test clips handy? If so, I'll test and let you know :)

edit: tested some of the samples linked from the samples wiki page, and it's hard to make a generalization based on bitrate and framerate because encoding profiles differ so greatly. I can say that all of the 1080p 24/30 fps samples linked from http://www.libde265.org/downloads-videos/ played without a problem, but the two "H.265 1080p (medium bitrate)" test files linked directly from the wiki page (also 24fps) pegged both CPU cores at nearly 100% and were not smooth to watch. If I had to generalize, it seems like most <10Mbps 24/30fps content wasn't a problem.
A more reliable indicator for HEVC sample files is probably to throw them into a file analyser like MediaInfo to get the "Video Stream Bit Rate (Nominal)" in kbps.

Then we can quantify what Low, Medium and High bitrate 1080p HEVC playback actually means. At the moment everyone has a different interpretation, which is the problem for software decoding comparisons.
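To make the point concrete, a classification like that could be sketched in a few lines. The kbps thresholds below are illustrative assumptions only (loosely based on the "<10Mbps was fine" observation earlier in the thread), not agreed values:

```python
# Hypothetical bucketing of a MediaInfo "Nominal bit rate" reading (kbps)
# into rough software-decode difficulty tiers. Thresholds are assumptions.

def classify_bitrate(nominal_kbps: int) -> str:
    """Bucket a 1080p HEVC nominal bitrate (kbps) into Low/Medium/High."""
    if nominal_kbps < 5000:
        return "low"
    if nominal_kbps < 10000:   # the <10 Mbps figure mentioned above
        return "medium"
    return "high"

print(classify_bitrate(3500))   # low
print(classify_bitrate(8000))   # medium
print(classify_bitrate(20000))  # high
```

Of course, as pointed out below, bitrate alone ignores encoding profile, so this can only ever be a rough first cut.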

@Matt Devo, what are you classifying as Low, Medium and High bitrate HEVC?

#77
I think Matt was referring to the description "medium" on the wiki samples page. And as Matt pointed out, it's not just bitrate that matters; encoding profiles are very important.
If I have helped you or increased your knowledge, click the 'thumbs up' button to give thanks :) (People with less than 20 posts won't see the "thumbs up" button.)
#78
Too many variables in play I think to classify anything accurately other than to say Hardware decoding of any codec provides peace of mind and certainty that it will get the job done properly. :)

#79
Provided the hardware makers get it right, of course. And there will always be limits to what a hardware decoder will do, in terms of profile, frame size and bitrate. So hardware might decode up to profile X, but that's no good if your encoder is using profile Z.
#80
(2016-03-13, 07:38)wrxtasy Wrote: Too many variables in play I think to classify anything accurately other than to say Hardware decoding of any codec provides peace of mind and certainty that it will get the job done properly. :)

Yes - though there are codec families that you can only decode in software (ProRes, DNxHD and DVCPRO HD pro codecs, and usually the 4:2:2 variants of MPEG2 and H264) - so in some cases you have to go for CPU decode. Then a number of things really play a role: obviously the profile and bitrate the encoder has run at, and the overall complexity of the decoding process, but more important for Kodi is how well the decoding algorithm has been implemented and optimised in ffmpeg, and in particular whether it exploits multi-threading. H264 4:2:2 40Mbs 1080/50i went from unplayable on a quad core Ivy Bridge i5 NUC to playable with YADIF 2x CPU deinterlacing when multithreaded decode arrived.

My advice for software decode is usually to throw a bit of processing at it - don't go for the bare minimum. Leave yourself some headroom.
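One way to gauge that headroom is a null decode with ffmpeg (e.g. `ffmpeg -benchmark -i sample.mkv -f null -`) and reading the `speed=` figure from its progress output: comfortably above 1.0x means spare CPU. As a small sketch (the helper name is mine; the sample line mimics ffmpeg's usual progress format), the multiplier can be pulled out like this:

```python
import re

# Sketch: parse the decode speed multiplier from an ffmpeg progress line.
# A speed well above 1.0x suggests the CPU has decode headroom.

def parse_speed(progress_line: str):
    """Return the 'speed=N.NNx' multiplier from an ffmpeg progress line, or None."""
    m = re.search(r"speed=\s*([\d.]+)x", progress_line)
    return float(m.group(1)) if m else None

line = "frame= 1200 fps= 48 q=-0.0 size=N/A time=00:00:50.00 bitrate=N/A speed=1.85x"
print(parse_speed(line))  # 1.85
```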
#81
Yes, the same recommendation I make for the i-Series NUCs in the Pick the Right Kodi Box thread.
Throw software decoding and CPU cycles at the problem if you have varied and very specific decoding requirements.

#82
(2016-03-14, 11:53)noggin Wrote:
(2016-03-13, 07:38)wrxtasy Wrote: Too many variables in play I think to classify anything accurately other than to say Hardware decoding of any codec provides peace of mind and certainty that it will get the job done properly. :)
...
H264 4:2:2 40Mbs 1080/50i went from unplayable on a quad core Ivy Bridge i5 NUC to playable with YADIF 2x CPU deinterlacing when multithreaded decode arrived.

So I can take that as an answer for my question here too. Is a Core i5 the minimum to achieve H264 4:2:2 40Mbs 1080/50i decoding in software, or would a Core i3 be enough? That is, is the i5 the headroom to have, or is an i3 with multithreaded software decoding already enough headroom?

Thanks again for all your support.
Regards
Vlaves
#83
(2016-03-13, 07:38)wrxtasy Wrote: Too many variables in play I think to classify anything accurately other than to say Hardware decoding of any codec provides peace of mind and certainty that it will get the job done properly. :)

true, but the Haswell/Broadwell Celerons can handle BBB 1080p/24 HEVC with <50% CPU, zero drops/zero skips. They can also handle a certain recent "release" taking place a long time ago, in a galaxy far, far away, which is HEVC 10-bit at 1920x800/24fps, without any skips or drops.
#84
(2016-03-14, 21:04)Vlaves Wrote:
(2016-03-14, 11:53)noggin Wrote:
(2016-03-13, 07:38)wrxtasy Wrote: Too many variables in play I think to classify anything accurately other than to say Hardware decoding of any codec provides peace of mind and certainty that it will get the job done properly. :)
...
H264 4:2:2 40Mbs 1080/50i went from unplayable on a quad core Ivy Bridge i5 NUC to playable with YADIF 2x CPU deinterlacing when multithreaded decode arrived.

So I can take that as an answer for my question here too. Is a Core i5 the minimum to achieve H264 4:2:2 40Mbs 1080/50i decoding in software, or would a Core i3 be enough? That is, is the i5 the headroom to have, or is an i3 with multithreaded software decoding already enough headroom?

Thanks again for all your support.
Regards
Vlaves

All 4 threads (dual core with Hyper-Threading means 4 CPUs reported) sit at well over 50%, so I think it would possibly be too much for a similar low-power i3. I can't say for certain as I don't have any low-power NUC-style i3 set-ups. However, this is 4:2:2, not Hi10.
#85
Tested samples from here: https://www.koi-sama.net/files/hi10/ on my Core M 5Y10.

720p was OK at ~30% average / 60% peak CPU; 1080p ran at ~70% average / 100% peak (across 2 cores, 4 threads).

Also, I could play 720p quite fine on my Galaxy K Zoom (Exynos hexa-core) with VLC.

No chance for 1080p, of course! Has anyone tested on the latest high-end smartphones, like the S7?
#86
(2016-01-28, 18:13)Ned Scott Wrote: I finally got around to testing a 1080 Hi10P video on the Nvidia Shield TV and WeTek Core. They both played the file without any issues, using software decoding. Everything was smooth and there were no skips, stutters, or drops, as far as my eyes can tell.

I'll see if I can find some more demanding Hi10P samples, as different groups can encode differently, etc. I know some people said they had issues, so if you have a specific file you want me to test, feel free to PM me.

Does that mean all AMLogic S812-H devices will do 1080p Hi10P, or does it depend on the device?
#87
No AMLogic SoC can hardware decode H264 Hi10P.
AML can software decode some 720/1080p Hi10P, but there are still too many problems when tested with more than a single video sample. It's not reliable at all.

You really do need proper Intel hardware.

#88
H264 Hi10 4:2:0 is a dead-end codec. It has never had hardware acceleration support (a Rockchip SoC was rumoured to have hardware support - but Rockchip Linux support is lousy), and so the only real solution is CPU decode - which really means a decent x86 platform at the moment.

Hopefully the perceived benefit of H264 Hi10 (reducing banding on anime re-encodes, AIUI?) will transfer to H265 10-bit - which DOES have hardware SoC support (the AML S905 does it, for instance) - so that gets used instead in the future. We could then finally stop having to answer the same questions. Again. And again. And again.
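A quick way to spot the files in question is the pixel format ffprobe reports (e.g. `ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,pix_fmt -of csv=p=0 file.mkv`). The helper below is a sketch of my own (the function name and the rule of thumb are mine): typical SoCs hardware-decode only 8-bit 4:2:0 H264, while 10-bit 4:2:0 HEVC (Main10) does have SoC support as noted above:

```python
# Sketch (helper name is mine): given ffprobe's codec_name and pix_fmt,
# guess whether playback will fall back to CPU decode on a typical SoC.

def likely_needs_software_decode(codec: str, pix_fmt: str) -> bool:
    ten_bit = "10" in pix_fmt        # e.g. yuv420p10le (Hi10P / Main10)
    chroma_422 = "422" in pix_fmt    # e.g. yuv422p (broadcast/pro H264)
    if codec == "h264":
        return ten_bit or chroma_422  # no hardware decoder for either
    if codec == "hevc":
        return chroma_422             # Main10 4:2:0 has SoC hardware support
    return False

print(likely_needs_software_decode("h264", "yuv420p10le"))  # True  (Hi10P)
print(likely_needs_software_decode("hevc", "yuv420p10le"))  # False (Main10)
```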
#89
(2016-05-23, 14:10)noggin Wrote: H264 Hi10 4:2:0 is a dead-end codec. It has never had hardware acceleration support (a Rockchip SoC was rumoured to have hardware support - but Rockchip Linux support is lousy), and so the only real solution is CPU decode - which really means a decent x86 platform at the moment.

Hopefully the perceived benefit of H264 Hi10 (reducing banding on anime re-encodes, AIUI?) will transfer to H265 10-bit - which DOES have hardware SoC support (the AML S905 does it, for instance) - so that gets used instead in the future. We could then finally stop having to answer the same questions. Again. And again. And again.

Where I'm at a loss is how this hasn't happened yet. Some in the anime scene want the leanest and meanest compression, which is why it's really the only scene where you see Hi10 in use. With HEVC/x265 readily available, why haven't a good number jumped ship already?
#90
lol, I can even play DVD-resolution HEVC on my Pentium M 2.13GHz with 50% average CPU load, but Hi10P lags while using 100%.

Such a stillborn codec...