A serious examination of 1080p hi10p hardware requirements
#1
I know this is not of high interest to everyone, but it seems that, pending newer standards, 1080p Hi10P is the 800-pound gorilla in the room in terms of hardware requirements. I personally wrote to NVIDIA some months back to ask whether they had plans to implement GPU decoding for this profile, and they said "no" in no uncertain terms.

I want to get a general idea of what hardware people are using to decode the H.264 Hi10P profile at 1080p resolution - just to keep track of what's necessary, what the builds look like, cost, etc.

This is partly because my trusty old Acer Aspire Revo R3610 has finally decided to start dying, which means I have to build or buy something else. AMD APU? Core i3? Thoughts?
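As a side note, before sizing hardware it helps to confirm a file actually uses the High 10 profile. A minimal Python sketch of the check (the sample dict is hypothetical; real values would come from parsing `ffprobe -print_format json -show_streams <file>` from FFmpeg):

```python
# Sketch: classify a video stream as Hi10P from ffprobe-style fields.
# The sample dict below is made up for illustration; a real one comes
# from parsing `ffprobe -print_format json -show_streams <file>`.
def is_hi10p(stream):
    # Hi10P is H.264's "High 10" profile, paired with a 10-bit pixel
    # format such as yuv420p10le.
    return (stream.get("codec_name") == "h264"
            and stream.get("profile") == "High 10")

sample = {"codec_name": "h264", "profile": "High 10",
          "pix_fmt": "yuv420p10le", "width": 1920, "height": 1080}
print(is_hi10p(sample))  # True
```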
Reply
#2
I use a 2009 Core 2 Duo MacBook Pro that can software-decode 1080p Hi10P. An Intel Celeron G530 or higher should also be a safe bet.
Reply
#3
If you want 1080p Hi10P, software decode is your only option, and you will need big iron for that. Welcome the noisy beast back into the media room. So sad - I thought we had banished them, never to be seen again.
Reply
#4
We have enabled multithreaded decoding for Hi10P. A Celeron 847 at 1.1 GHz, like in the Zotac ID42, can handle it (reported by Linux users).
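For anyone wanting to check whether their CPU keeps up, FFmpeg can software-decode a file flat-out to a null output and report timings. A sketch of assembling that command in Python (the filename is a placeholder, not a file from this thread):

```python
# Sketch: build an FFmpeg software-decode benchmark command.
# -threads 0 would let FFmpeg pick one decode thread per core;
# -f null - discards the decoded frames instead of writing a file.
def decode_benchmark_cmd(path, threads=0):
    return ["ffmpeg", "-threads", str(threads), "-i", path,
            "-benchmark", "-f", "null", "-"]

cmd = decode_benchmark_cmd("sample-1080p-hi10p.mkv", threads=2)
print(" ".join(cmd))
```

If the benchmark finishes faster than the clip's running time at the thread count you plan to use, software decode in XBMC should be realistic on that CPU.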
Reply
#5
Interesting - I was under the impression that the version of FFmpeg that XBMC uses did not support multithreaded software decoding. Is that included in Frodo?
Reply
#6
How about AMD CPUs? Has anyone had any success with software decode on the AMD platform?
Reply
#7
I have an A8-3870K at 3.0 GHz. With single-core 10-bit decoding this was enough for most 10-bit content, but a few rare scenes caused issues - mostly those featuring a lot of CG static/grain or other extremely noisy video.

With the multicore nightlies, it's more than enough to handle everything thrown at it. So I'd say fairly confidently that a dual-core 3.0 GHz AMD will be more than enough.

My roommate has asked me to order parts to build her one; her old WDTV box hit the wall with header compression in MKVs and 10-bit. I ordered an A6-5400K, a dual-core 3.6 GHz processor, and it should have no problem with 10-bit.

My greater concern is H.265 and VP9, which will likely be CPU-only for a while before any kind of hardware solution comes around. Hopefully the A6-5400K will be able to decode them, but at worst it will only be a CPU upgrade away from working. I'm pretty confident the A8-3870K will be fine for H.265 and VP9, though... at least for 1080p. 4K will be another animal.

I think 10-bit isn't all you should focus on. Build a machine that's "just enough" for 10-bit and you'll be in the same place again in 12-24 months dealing with H.265 and/or VP9. Might as well build for those now, or at least be prepared. I figure that if you're going to build a box, spend a bit more and build one that lasts 5-8 years.
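To put rough numbers on the "another animal" point: decode work scales at least with the raw pixel rate, so (assuming equal frame rates; codec complexity adds on top of this):

```python
# Rough pixel-rate ratios between common resolutions. Real decode cost
# also depends on codec, profile, and bit depth, so treat these as
# lower bounds on the extra work.
def pixel_rate(width, height, fps):
    return width * height * fps

r720p = pixel_rate(1280, 720, 24)
r1080p = pixel_rate(1920, 1080, 24)
r4k = pixel_rate(3840, 2160, 24)
print(r1080p / r720p)  # 2.25 - 1080p pushes 2.25x the pixels of 720p
print(r4k / r1080p)    # 4.0  - 4K is four times 1080p again
```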
Reply
#8
For now I've picked up an AMD E-350 APU. Mobo/CPU + memory + case + 64 GB SSD came to just over $300. We'll see how it does, and I'll post once it's built!
Reply
#9
(2013-05-31, 01:54)blm14 Wrote: For now I've picked up an AMD E-350 APU. Mobo/CPU + memory + case + 64 GB SSD came to just over $300. We'll see how it does, and I'll post once it's built!

Yeah, that's totally not going to play 10-bit H.264. There just isn't nearly enough power in there.

As an update on my own testing: I built an XBMCbuntu box with the dual-core 3.6 GHz A6-5400K and it is suitable for 10-bit H.264. 95% of the time one core is enough; sometimes it needs both. I'm running 13.0 Alpha 3 because it supports multithreaded CPU decoding.
Reply
#10
The AMD E-350 can't even play 1080p without GPU acceleration. 720p is OK.
Reply
#11
From what I have read here, a Core i3-3225 is more than enough for 10-bit and H.265?
Reply
#12
(2013-05-31, 12:57)solamnic Wrote: From what I have read here, a Core i3-3225 is more than enough for 10-bit and H.265?

10-bit, sure. H.265? It's really hard to say. One important factor will be the quality and efficiency of the decoder. Any decoder in FFmpeg will likely need time to mature, with performance increasing as it gets refined. FFmpeg's H.264 decoding in XBMC was a lot like that: I remember updating XBMC on my classic Xbox and seeing 480p H.264 performance improve over time.
Reply
#13
It depends on resolution and coding levels too. Are we talking H.265 at 576p or H.265 at 8K? Does anyone have any H.265 sample material?
Reply
#14
It's playing regular 1080p just fine, actually. :) But yeah, Hi10P is still a no-go at full HD, though it plays 720p Hi10P without problems.
Reply
#15
(2013-06-01, 13:15)blm14 Wrote: It's playing regular 1080p just fine, actually. :) But yeah, Hi10P is still a no-go at full HD, though it plays 720p Hi10P without problems.

That's because the GPU is doing the 8-bit stuff; its CPU isn't capable of doing 8-bit 1080p on its own either.

As for 720p, I suggest you test that further. In my experience on an E-350, 720p will mostly work, but certain highly complex scenes start to stutter due to their higher computational demand.
Reply