2009-01-21, 09:01
mr_raider Wrote:The point is any hardware that can play 720p, can probably play 1080i.
At any rate, the hardware requirements have nothing to do with screen resolution; they should be determined by the bandwidth of the content that is being decoded.
First, 1080i content deinterlaces to 1080p more cleanly than 720p upscales to it. So a 1080p set is not useless even if it's only ever fed 1080i content. That was part of my initial argument to timgray for why 1080p isn't useless. I wasn't arguing that 1080i is more CPU intensive to decode than 720p.
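To illustrate the deinterlacing point: a 1080i frame arrives as two 540-line fields, and for film-sourced or static material a simple weave reassembles the full 1080 lines with no scaling at all, whereas 720p always has to be upscaled to fill a 1080p panel. Here's a minimal sketch of a weave (numpy, hypothetical field arrays, not XBMC's actual deinterlacer; moving video-sourced content needs a smarter motion-adaptive method):

Code:
import numpy as np

def weave_deinterlace(top_field, bottom_field):
    """Weave two 540-line fields into one 1080-line progressive frame.

    Each field is (540, 1920); the result is (1080, 1920). For static or
    film-sourced content this reconstructs the full 1080p frame exactly,
    with no scaling -- unlike 720p, which must be upscaled to 1080p.
    """
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]),
                     dtype=top_field.dtype)
    frame[0::2] = top_field     # even lines come from the top field
    frame[1::2] = bottom_field  # odd lines come from the bottom field
    return frame

# Hypothetical fields, just to show the shapes work out
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave_deinterlace(top, bottom).shape)  # (1080, 1920)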
Second, hardware requirements *do* depend on output resolution: the CPU must not only decode the video stream (relatively easy), it must also scale and render that stream, *plus* any UI overlays, using OpenGL.
Example: the 45 Mbit 1080p x264 killa sample file would play back with 0 dropped frames on my C2D E6600 with my old 1680x1050 LCD, but drops frames heavily on my new 1920x1200 LCD (using XBMC 8.10 for Windows, full screen).
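The back-of-envelope numbers line up with that: the decode workload (45 Mbit/s) is identical on both displays, but the renderer has to scale and composite roughly 31% more pixels per frame on the bigger panel. A quick sketch using the two resolutions above:

Code:
# Pixels the renderer must produce per frame at each desktop resolution
# (full-screen output, figures from the post above).
old = 1680 * 1050   # 1,764,000 px
new = 1920 * 1200   # 2,304,000 px
print(f"old: {old:,} px  new: {new:,} px  increase: {new / old - 1:.0%}")
# -> about 31% more pixels to scale and composite every single frame,
#    on top of the unchanged decode workload.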