Kodi Community Forum

Full Version: Kodi playing 1920 x 1080 as 1919 x 1080?
I was using Kodi (well, latest OpenElec stable on an HP Chromebox) to do some tests on a television using a 1920x1080 black-and-white individual pixel checkerboard, and I noticed I was getting some odd artifacts in the middle of the picture that gradually spread outwards. Like, the checkerboard was not clean, and was being resampled. Zoom ratio is set at exactly 1.0, and the TV is in no-overscan mode.

In trying to track down this issue, I noticed that if I changed the resizing method to "nearest neighbor" the larger resampling problem disappeared, but then the checkerboard had an alignment error dead center - like it was missing a vertical line.

And that's when I discovered what the real problem was. If I had the View Mode set to Normal / Custom / Original, Kodi was slightly squishing the video horizontally - by exactly one pixel, taken off the right edge. But if I changed to "Wide Zoom" or "16x9 Stretch", then presto - the unused column of pixels on the right edge disappeared and the checkerboard snapped into pixel-perfection.

It's not just that one video file either - it's miscalculating the playback dimensions of every single 1920x1080 video I have as 1919x1080, resampling them all by one pixel horizontally (which, if you saw the checkerboard, results in a whole lot of changes!)

Any idea what's the issue here? I checked an older Frodo Windows NUC system (on another television) and the same thing doesn't seem to be happening.
Same when you disable VAAPI and use Software Decoding?
(2014-12-08, 09:25)fritsch Wrote: Same when you disable VAAPI and use Software Decoding?

Just checked. Exact same issue in VAAPI, VAAPI w/ SW Filter, and Software decoding.
Just to get this right: the moment you change the aspect ratio away from Original / Normal to 16:9, everything is fine?

Can you post a Debug Log, please?

Does it make a difference if you switch between the scalers? E.g. Lanczos3 Optimized vs. Bilinear / Nearest Neighbour? (I know that no scaling should be added at all, but those have different src, dest rectangles).
How often do you sit around watching checkerboards at night?
Hehe, this is not my main concern. My concern is wasted cycles on poor GPUs :-)
Well, thanks to the debug log I think I've figured this one out.

The television - a Sony XBR-49X850B - is reporting all of its 16x9 resolutions like this:

INFO: ID:0xb6 Name:1920x1080 Refresh:23.976080 Width:1920 Height:1080
INFO: Pixel Ratio: 1.000512

And if we take 1920 and divide by the oddball pixel ratio of 1.000512... we get 1919.017.

So, I'm thinking Kodi is just doing its job in calculating the resulting playback resolution. And the solution appears to be to change the "allow aspect ratio error to minimize black bars" option to 1%? Seems to work in my test, at any rate.
Thanks for reporting back.

@FernetMenta: Should we introduce some rounding at the second position after the decimal point, to work around monitors that report their ratio incorrectly? E.g. 1.000512 becomes 1.0 but 1.0512 becomes 1.05?

So: float rratio = ((int)(ratio * 100)) / 100.0f; or something?
Here are the logs, at any rate. The TV supports a fairly wide range of resolutions. And as a user, I'd prefer an "error" of 1 or 2 pixels in order to eliminate needless scaling.

(2014-12-08, 11:06)fritsch Wrote: Thanks for reporting back.

@FernetMenta: Should we introduce some rounding at the second position after the decimal point, to work around monitors that report their ratio incorrectly? E.g. 1.000512 becomes 1.0 but 1.0512 becomes 1.05?

So: float rratio = ((int)(ratio * 100)) / 100.0f; or something?

We need the precision for those monitors which report correct data. Users can override the EDID in case it is incorrect.
I think we need a sanity check nevertheless. As seen above, the user is getting exactly one pixel wrong. E.g. use 1.0 as the ratio when the scaled result is only 1px off? That's most likely much better than running bilinear / nearest neighbour over the image.
The place to do this is this method: void CBaseRenderer::CalcNormalDisplayRect
What about this minimal approach:

diff --git a/xbmc/cores/VideoRenderers/BaseRenderer.cpp b/xbmc/cores/VideoRenderers/BaseRenderer.cpp
index 83c3adb..f56489e 100644
--- a/xbmc/cores/VideoRenderers/BaseRenderer.cpp
+++ b/xbmc/cores/VideoRenderers/BaseRenderer.cpp
@@ -423,7 +423,14 @@ void CBaseRenderer::CalcNormalDisplayRect(float offsetX, float offsetY, float sc

   // allow a certain error to maximize screen size
   float fCorrection = screenWidth / screenHeight / outputFrameRatio - 1.0f;
+  // always allow a 2 px aspect error to absorb rounding artifacts, in case the
+  // user did not override the setting
+  float pxerror     = std::max(2.0f / screenWidth, 2.0f / screenHeight);
   float fAllowed    = CSettings::Get().GetInt("videoplayer.errorinaspect") * 0.01f;
+  if (fAllowed == 0.0f)
+    fAllowed = pxerror;
   if(fCorrection >   fAllowed) fCorrection =   fAllowed;
   if(fCorrection < - fAllowed) fCorrection = - fAllowed;

When the user did not set an error in aspect, we allow 2px max in either height or width?
That would be confusing and inconsistent:

user allows 0 means allowed error of 2
user allows 1 means allowed error of 1
user allows 2 means allowed error of 2

Does this make sense? :-)

Why not just set the default errorinaspect to 1 or 2?

EDIT: this is wrong anyway, because you would get an unwanted correction for all aspect ratios != screen aspect.
I think what we need here is: if the calculated destination rect is just one pixel off from the screen dimensions, use the screen width/height.