2010-05-06, 20:31
I've searched and searched and I've yet to come across a clear answer to my questions. This may seem redundant, or may have been answered before. If so, please point me in the right direction. So here goes:
I have a Samsung Model 46A650 TV attached to my HTPC running Windows 7 and XBMC. I have an ATI Radeon 4350 video card which is connected to the TV via HDMI.
In the Catalyst driver settings, I can choose from several pixel formats, as most everyone is probably familiar with.
My (limited) understanding of colorspaces and pixel formats leads me to believe that to match the studio color levels (16-235) of video playback, I need to select the YCbCr 4:4:4 pixel format, as this is what I understand a television normally expects as input.
However, on my television, I can change the label of the HDMI 2 input to "PC", which as I've read tells the television to expect PC level RGB (0-255).
So, in short, my television can display both 0-255 and 16-235 levels. I don't experience black crush or anything like that, because the TV is getting what it's expecting.
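For reference, my understanding of the math behind the two level ranges is just a linear remapping. This is only a sketch of the standard 8-bit scaling (the function names are my own, not anything from the Catalyst driver):

```python
# Sketch of how "studio" (limited, 16-235) and "PC" (full, 0-255)
# 8-bit levels map onto each other. Illustrative only.

def limited_to_full(y):
    """Expand a limited-range (16-235) code value to full range (0-255)."""
    return round((y - 16) * 255 / 219)

def full_to_limited(y):
    """Compress a full-range (0-255) value into limited range (16-235)."""
    return round(y * 219 / 255 + 16)
```

So reference black (16) in a limited-range signal lands on 0 in a full-range one, and reference white (235) lands on 255, which is why a mismatch between the two shows up as black crush or washed-out blacks.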
I currently have the output set to the RGB 4:4:4 Pixel Format PC Standard (Full RGB), along with having HDMI 2 set to PC Mode. It's crisp and clean, as expected, because I assume the TV isn't doing any processing of the video signal and is acting like a PC monitor.
My concern however is that since I have the output set to Full RGB, am I losing color quality/depth in my actual video playback? If I have the YCbCr pixel format selected, the desktop and XBMC don't look as sharp, but I feel like the video looks better. I may be crazy.
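As I understand it, somewhere in the chain the decoded YCbCr video has to become RGB anyway; for HD content that's the BT.709 matrix. A rough sketch of that conversion on full-range 8-bit values (coefficients are the standard ones, the function name is mine):

```python
# Sketch of the YCbCr -> RGB conversion done somewhere in the playback
# chain for HD video, using BT.709 coefficients on full-range 8-bit values.

def ycbcr709_to_rgb(y, cb, cr):
    """Convert one full-range 8-bit BT.709 YCbCr sample to an RGB triple."""
    r = y + 1.5748 * (cr - 128)
    g = y - 0.1873 * (cb - 128) - 0.4681 * (cr - 128)
    b = y + 1.8556 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)
```

If that's right, the question is really just *where* the conversion happens (player, GPU, or TV) and whether the levels stay consistent through it, not whether one format carries "more" color than the other.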
And after that long diatribe, here are my questions:
1.) What colorspace/pixel format is XBMC using for video playback, and should I match the output pixel format to that?
2.) Am I getting YCbCr and RGB mixed up? Is one better than the other for video playback?
3.) Am I worrying about something I shouldn't because there's some magical conversion being done behind the scenes (or because I'm a perfectionist)?
Thanks for reading this, and I'd really appreciate any feedback you can give.