Kodi Community Forum

Full Version: [MOVED] Problems starting XBMC fullscreen
The options are important too. As for the modelines, well, to be honest I cheated and just downloaded some - they likely do not produce pixel-perfect images, but I don't care.

My scaler can give me the information needed to write the modelines by adjusting suitable test images until everything is perfect - a process known as calibration, for which people pay a lot of money.
The xorg.conf options look mostly the same as mine - except for the modeline stuff. I'll have a look after this. Meanwhile, some lines from Xorg.0.log:

PHP Code:
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Option "NoFlip" "True"
(**) NVIDIA(0): Option "NoLogo" "True"
(**) NVIDIA(0): Option "NvAGP" "1"
(**) NVIDIA(0): Option "RenderAccel" "True"
(**) NVIDIA(0): Option "TwinView" "False"
(**) NVIDIA(0): Option "ExactModeTimingsDVI" "True"
(**) NVIDIA(0): Option "Coolbits" "1"
(**) NVIDIA(0): Option "TripleBuffer" "True"
(**) NVIDIA(0): Option "UseEvents" "True"
(**) NVIDIA(0): Option "AddARGBGLXVisuals" "True"
(**) NVIDIA(0): Option "DamageEvents" "True"
(**) NVIDIA(0): Option "FlatPanelProperties" "Scaling = Native"
(**) NVIDIA(0): Option "DynamicTwinView" "False"
(**) NVIDIA(0): Option "PixmapCacheSize" "1000000"
(**) NVIDIA(0): Option "AllowSHMPixmaps" "0"
(**) NVIDIA(0): OpenGL flipping disabled
(**) NVIDIA(0): Enabling RENDER acceleration
(**) NVIDIA(0): Use of NVIDIA internal AGP requested
(II) NVIDIA(0): NVIDIA GPU GeForce 8300 (C77) at PCI:2:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 524288 kBytes
(--) NVIDIA(0): VideoBIOS: 62.77.2f.00.00
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 8300 at PCI:2:0:0:
(--) NVIDIA(0):     SONY TV XV (DFP-0)
(--) NVIDIA(0): SONY TV XV (DFP-0): 165.0 MHz maximum pixel clock
(--) NVIDIA(0): SONY TV XV (DFP-0): Internal Single Link TMDS
(WW) NVIDIA(0): The EDID for SONY TV XV (DFP-0) contradicts itself: mode
(WW) NVIDIA(0):     "1920x1080" is specified in the EDID; however, the EDID's
(WW) NVIDIA(0):     valid VertRefresh range (48.000-62.000 Hz) would exclude
(WW) NVIDIA(0):     this mode's VertRefresh (24.0 Hz); ignoring VertRefresh
(WW) NVIDIA(0):     check for mode "1920x1080".
(WW) NVIDIA(0): The EDID for SONY TV XV (DFP-0) contradicts itself: mode
(WW) NVIDIA(0):     "1920x1080" is specified in the EDID; however, the EDID's
(WW) NVIDIA(0):     valid VertRefresh range (48.000-62.000 Hz) would exclude
(WW) NVIDIA(0):     this mode's VertRefresh (24.0 Hz); ignoring VertRefresh
(WW) NVIDIA(0):     check for mode "1920x1080".
(II) NVIDIA(0): Assigned Display Device: DFP-0
(==) NVIDIA(0):
(==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
(==) NVIDIA(0):     will be used as the requested mode.
(==) NVIDIA(0):

As you can see, the driver even complains about contradictory EDID information from the TV set. I think this fits perfectly with the issue everybody's seeing.
Hehe - seems like it is as easy as this:
Code:
haggy@aereogramme /var/tmp $ gtf 1920 1080 50

  # 1920x1080 @ 50.00 Hz (GTF) hsync: 55.60 kHz; pclk: 141.45 MHz
  Modeline "1920x1080_50.00"  141.45  1920 2032 2232 2544  1080 1081 1084 1112  -HSync +Vsync

or to try this at home:

Code:
gtf <horizontal_res> <vertical_res> <refresh_rate>
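
If a modeline from gtf looks usable, it would normally be pasted into the Monitor section of xorg.conf and then referenced by name from the Screen section. Rough sketch only - the identifiers and the 50 Hz modeline below are just the example values from above, not anything verified on this hardware:

Code:
Section "Monitor"
    Identifier "TV"
    # modeline copied verbatim from the gtf output above
    Modeline "1920x1080_50.00"  141.45  1920 2032 2232 2544  1080 1081 1084 1112  -HSync +Vsync
EndSection

Section "Screen"
    Identifier   "Screen0"
    Monitor      "TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        # request the custom mode by the name given in the Modeline
        Modes "1920x1080_50.00"
    EndSubSection
EndSection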
For posterity, here's mine...

PHP Code:
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Option "ExactModeTimingsDVI" "True"
(**) NVIDIA(0): Option "UseDisplayDevice" "DFP-0"
(**) NVIDIA(0): Option "AddARGBGLXVisuals" "True"
(**) NVIDIA(0): Option "ModeValidation" "DFP-0: NoMaxSizeCheck, NoHorizSyncCheck, NoVertRefreshCheck, AllowNon60HzDFPModes, NoMaxPClkCheck, NoVesaModes, NoXServerModes, NoPredefinedModes"
(**) NVIDIA(0): Option "FlatPanelProperties" "Scaling = Native"
(**) NVIDIA(0): Option "DynamicTwinView" "false"
(**) NVIDIA(0): Enabling RENDER acceleration
(II) NVIDIA(0): NVIDIA GPU GeForce 9400 GT (G96) at PCI:1:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 524288 kBytes
(--) NVIDIA(0): VideoBIOS: 62.94.4b.00.77
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 9400 GT at PCI:1:0:0:
(--) NVIDIA(0):     Sony SDM-P234 (DFP-0)
(--) NVIDIA(0): Sony SDM-P234 (DFP-0): 330.0 MHz maximum pixel clock
(--) NVIDIA(0): Sony SDM-P234 (DFP-0): Internal Dual Link TMDS
(II) NVIDIA(0): Mode Validation Overrides for Sony SDM-P234 (DFP-0):
(II) NVIDIA(0):     AllowNon60HzDFPModes
(II) NVIDIA(0):     NoMaxPClkCheck
(II) NVIDIA(0):     NoMaxSizeCheck
(II) NVIDIA(0):     NoHorizSyncCheck
(II) NVIDIA(0):     NoVertRefreshCheck
(II) NVIDIA(0):     NoVesaModes
(II) NVIDIA(0):     NoXServerModes
(II) NVIDIA(0):     NoPredefinedModes
(II) NVIDIA(0): Assigned Display Device: DFP-0
(==) NVIDIA(0):
(==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
(==) NVIDIA(0):     will be used as the requested mode.
(==) NVIDIA(0):
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0):     "nvidia-auto-select"
(II) NVIDIA(0): Virtual screen size determined to be 1920 x 1200
(--) NVIDIA(0): DPI set to (97, 98); computed from "UseEdidDpi" X config
(--) NVIDIA(0):     option
(WW) NVIDIA(0): 32-bit ARGB GLX visuals require the Composite extension.
(WW) NVIDIA(0): Disabling 32-bit ARGB GLX visuals.
Haggy Wrote:Hehe - seems like it is as easy as this:
Code:
haggy@aereogramme /var/tmp $ gtf 1920 1080 50

  # 1920x1080 @ 50.00 Hz (GTF) hsync: 55.60 kHz; pclk: 141.45 MHz
  Modeline "1920x1080_50.00"  141.45  1920 2032 2232 2544  1080 1081 1084 1112  -HSync +Vsync

or to try this at home:

Code:
gtf <horizontal_res> <vertical_res> <refresh_rate>

Learn something every day! Thanks :)
In nearly ten years of Linux I NEVER stumbled upon this handy tool...
Would you mind posting some Xorg.0.log lines from your NVIDIA 8200 machine, as this also seems to make a difference? Most of the issues seem to come from 8200/8300 GPUs.
motd2k Wrote:Learn something every day! Thanks :)

Thanks for the tips - how do you launch XBMC? I am currently using my .xsession to do it AND am running --standalone. I think this is not appropriate for using modelines, right? How do you autostart XBMC at boot time?

Thanks,
xnappo
xnappo,

I'm unsure whether it will work the way you describe or not. I simply add an entry to System/Prefs/Sessions.
Up to now I start via inittab and .xinitrc:

/etc/inittab:
Code:
x:5:once:/bin/su haggy -l -c "/bin/bash --login -c startx >/dev/null 2>&1"

and .xinitrc:
Code:
exec nvidia-settings --load-config-only &
exec ck-launch-session xbmc -fs --standalone

but I think I'll have to change this when using modelines - at least --standalone has to go, I think.
Unfortunately I don't think gtf is reliable - it's given me the exact same modeline for both my Sony monitor and 42" TV.

Better than nothing though, but I suspect it won't give proper 1:1 pixel mapping and whatnot.
This is from my 8200...

PHP Code:
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Option "ExactModeTimingsDVI" "True"
(**) NVIDIA(0): Option "UseDisplayDevice" "DFP-0"
(**) NVIDIA(0): Option "AddARGBGLXVisuals" "True"
(**) NVIDIA(0): Option "ModeValidation" "DFP-0: NoMaxSizeCheck, NoHorizSyncCheck, NoVertRefreshCheck, AllowNon60HzDFPModes, NoMaxPClkCheck, NoVesaModes, NoXServerModes, NoPredefinedModes"
(**) NVIDIA(0): Option "FlatPanelProperties" "Scaling = Native"
(**) NVIDIA(0): Option "DynamicTwinView" "false"
(**) NVIDIA(0): Enabling RENDER acceleration
(II) NVIDIA(0): NVIDIA GPU GeForce 8200 (C77) at PCI:2:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 524288 kBytes
(--) NVIDIA(0): VideoBIOS: 62.77.24.00.08
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 8200 at PCI:2:0:0:
(--) NVIDIA(0):     OEM 37'' TFT-TV (DFP-0)
(--) NVIDIA(0): OEM 37'' TFT-TV (DFP-0): 165.0 MHz maximum pixel clock
(--) NVIDIA(0): OEM 37'' TFT-TV (DFP-0): Internal Single Link TMDS
(II) NVIDIA(0): Mode Validation Overrides for OEM 37'' TFT-TV (DFP-0):
(II) NVIDIA(0):     AllowNon60HzDFPModes
(II) NVIDIA(0):     NoMaxPClkCheck
(II) NVIDIA(0):     NoMaxSizeCheck
(II) NVIDIA(0):     NoHorizSyncCheck
(II) NVIDIA(0):     NoVertRefreshCheck
(II) NVIDIA(0):     NoVesaModes
(II) NVIDIA(0):     NoXServerModes
(II) NVIDIA(0):     NoPredefinedModes
(II) NVIDIA(0): Assigned Display Device: DFP-0
(==) NVIDIA(0):
(==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
(==) NVIDIA(0):     will be used as the requested mode.
(==) NVIDIA(0):
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0):     "nvidia-auto-select"
(II) NVIDIA(0): Virtual screen size determined to be 1920 x 1080
(--) NVIDIA(0): DPI set to (304, 304); computed from "UseEdidDpi" X config
(--) NVIDIA(0):     option
(WW) NVIDIA(0): 32-bit ARGB GLX visuals require the Composite extension.
(WW) NVIDIA(0): Disabling 32-bit ARGB GLX visuals.
(--) Depth 24 pixmap format is 32 bpp
(II) do I need RAC?  No, I don't.
motd2k Wrote:xnappo,

I'm unsure whether it will work the way you describe or not. I simply add an entry to System/Prefs/Sessions.

So your xbmc command is simply 'xbmc.bin'. I am using '/usr/share/xbmc/xbmc.bin -q --standalone'. I don't have Gnome or anything installed - just the minimum install - so I think that to match your setup I need to change my .xsession to start X and then launch xbmc without --standalone...?

Thanks,
xnappo
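
For what it's worth, a minimal ~/.xsession along those lines might look like the sketch below - the xbmc.bin path is the one from the post above, and whether --standalone can really just be dropped is exactly the open question here:

Code:
#!/bin/sh
# load the saved nvidia-settings profile in the background (optional)
nvidia-settings --load-config-only &
# start XBMC fullscreen; put --standalone back if it turns out to be needed
exec /usr/share/xbmc/xbmc.bin -fs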
What does '-q' mean?
Haggy Wrote:What does '-q' mean?

I have no idea, I just followed this guide:
http://wiki.xbmc.org/?title=HOW-TO:_Inst...ep-by-step

I also noticed this line in motd2k's xorg.conf:
Option "ModeValidation" "DFP-0: NoMaxSizeCheck, NoHorizSyncCheck, NoVertRefreshCheck, AllowNon60HzDFPModes, NoMaxPClkCheck, NoVesaModes, NoXServerModes, NoPredefinedModes"

I think that might be important.

xnappo
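
For reference, that ModeValidation line normally sits with the other Option entries in the Device (or Screen) section of xorg.conf - roughly like this, where everything apart from the Option line is placeholder boilerplate:

Code:
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # relax the driver's mode checks so custom modelines aren't rejected
    Option "ModeValidation" "DFP-0: NoMaxSizeCheck, NoHorizSyncCheck, NoVertRefreshCheck, AllowNon60HzDFPModes, NoMaxPClkCheck, NoVesaModes, NoXServerModes, NoPredefinedModes"
EndSection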
motd2k Wrote:Unfortunately I don't think gtf is reliable - it's given me the exact same modeline for both my Sony monitor and 42" TV.

I don't think gtf takes the actual screen into account, as it is simply a nicer interface to a calculator :-)

Every screen should handle those computed modelines well, or am I wrong?
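
A quick way to see that these tools are pure calculators (assuming cvt is installed alongside gtf):

Code:
# gtf applies the VESA GTF formula - identical inputs give identical timings, whatever display is attached
gtf 1920 1080 50
# cvt applies the newer CVT formula; neither tool reads the EDID of the actual screen
cvt 1920 1080 50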