2016-05-20, 21:58
(2016-05-20, 21:08)Uoppi Wrote: Just a thought: my video is connected via HDMI to the TV while audio goes via optical and a quality outboard DAC to an analog stereo amp. I assume this could be the source of the varying clock deviation figures, but would someone be able to explain how exactly? Is the same audio clock of an HTPC used for all outputs, for example?

The audio clock in this case is in your motherboard audio chip. The video clock could be in a separate GPU or in an onboard GPU, but either way it's a different clock from the audio one.
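To make the consequence concrete, here's a minimal sketch of how one might estimate each clock's deviation by counting audio samples and video frames against the same wall-clock reference. All numbers and names here are made up for illustration; they aren't from any player's actual measurement code:

[code]
# Sketch: estimating audio and video clock deviation in ppm by
# counting hardware events over a common wall-clock interval.
# All rates and counts below are hypothetical.

NOMINAL_SAMPLE_RATE = 48000.0   # Hz, audio clock on the motherboard codec
NOMINAL_FRAME_RATE = 60.0       # Hz, video clock on the GPU

def clock_deviation_ppm(events_counted, nominal_rate, elapsed_seconds):
    """Deviation of a clock from its nominal rate, in parts per million."""
    measured_rate = events_counted / elapsed_seconds
    return (measured_rate / nominal_rate - 1.0) * 1e6

# Example: over 600 s the audio chip clocked out 28,800,864 samples
# while the GPU scanned out 35,999 frames (illustrative values only).
audio_ppm = clock_deviation_ppm(28_800_864, NOMINAL_SAMPLE_RATE, 600.0)
video_ppm = clock_deviation_ppm(35_999, NOMINAL_FRAME_RATE, 600.0)

# Because the two crystals are independent, audio and video drift apart
# at the *difference* of the two deviations.
print(f"audio: {audio_ppm:+.1f} ppm, video: {video_ppm:+.1f} ppm, "
      f"relative drift: {audio_ppm - video_ppm:+.1f} ppm")
[/code]

With those made-up numbers you'd get roughly +30 ppm on audio and -28 ppm on video, so the two streams slowly drift apart even though each looks fine on its own. That's why the deviation figures can vary: you're really seeing two independent crystals.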
When the first video cards with onboard HDMI audio came out, I hoped they would use a single clock for both audio and video, but alas, I'm not aware of any that do.
I used to have S/PDIF out and VGA out back then, but that was before I really started caring about smoothness in video playback, so I'm in the dark about how that setup influences clock deviation measurements, sorry.