Kodi Community Forum

Full Version: [Linux] high cpu usage in every situation
Pages: 1 2
Hi,

I've been running XBMC for some time now on an Ubuntu/Xubuntu HTPC with an ION 330 inside. The specific version is:
Quote:# apt-cache show xbmc
Package: xbmc
[...]
Maintainer: Andres Mejia [...]
Architecture: all
Version: 2:10.00~svn35648-karmic1
Suggests: xbmc-third-parties
Depends: xbmc-data (= 2:10.00~svn35648-karmic1), xbmc-skin-confluence (= 2:10.00~svn35648-karmic1)
Filename: pool/main/x/xbmc/xbmc_10.00~svn35648-karmic1_all.deb

I configured it properly to use VDPAU for video, so HD content is decoded and rendered by the GPU. Still, XBMC uses quite a lot of CPU, probably to render the UI. I wonder why such problems don't exist on Windows Vista. Which brings me to my question:

Is there a way I can configure XBMC to be rendered in hardware, rather than software?
Well, if VDPAU is functioning, you should have the correct NVIDIA drivers installed, which should also provide OpenGL-accelerated GUI rendering, thus eliminating the need for CPU power.
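To check that assumption, glxinfo (from the mesa-utils package) reports whether OpenGL rendering is actually hardware-accelerated. The sample output below is only an illustration; your renderer string will differ:

```shell
# On the HTPC itself you would run:
#   glxinfo | grep -E 'direct rendering|OpenGL renderer'
# and expect "direct rendering: Yes" plus an NVIDIA renderer string; a
# software renderer here means the GUI is being drawn by the CPU.
# The filter itself, demonstrated on a made-up sample so it runs anywhere:
sample='direct rendering: Yes
OpenGL renderer string: GeForce 9400M/PCI/SSE2'
printf '%s\n' "$sample" | grep -E 'direct rendering|OpenGL renderer'
```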

Long story short: post a debug log as described in the sticky thread, or stay with the Windows version.
Alright, here's the current xbmc.log. The Windows version is unfortunately not an option, for several other reasons. While XBMC is idle I get this CPU usage:

Quote:$ ps aux | grep xbmc
tv 2352 0.0 0.0 4536 1584 ? Ss 13:00 0:00 /bin/bash /usr/bin/xbmc
tv 2394 38.1 2.9 230440 91432 ? Rl 13:00 97:15 /usr/share/xbmc/xbmc.bin

Which also affects the overall system load quite drastically:

Quote:$ uptime
18:24:07 up 5:24, 2 users, load average: 1.35, 1.34, 1.30

If you need any other information, feel free to ask. I'm rather confused by this problem and would be glad of any ideas on how to fix it. Don't hesitate to go into complicated details - I have a university degree in computer science.
In System -> Appearance -> Skin, turn off navigation sounds and check if that makes a difference.
The navigation sounds are already turned off, because the nasty beep got on my nerves.
Alright, I decided to reinstall the NVIDIA drivers, since a broken driver would most probably cause OpenGL to fall back to software rendering on the CPU, rather than the GPU.

I followed the wiki entry here, but I most probably made the problem worse.

First I installed NVIDIA driver version 256, which did not work at all. Then I switched to the EnvyNG installation and driver version 195. Unfortunately, my X server won't start any longer and complains about "no screens found" and "Undefined Screen X referenced by ServerLayout Y", even though the screen configuration is there.

The strange thing is: I did not alter the xorg.conf configuration, nor did the installer. I've got the whole /etc under version control, that's how I know. :-)

Any ideas?
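For the "no screens found" part: the X server writes the underlying reason to /var/log/Xorg.0.log and tags error lines with "(EE)". A sketch of the filter, run here on a made-up log excerpt since the real log only exists on the affected box:

```shell
# On the HTPC:  grep '(EE)' /var/log/Xorg.0.log
# shows which device/screen section X actually rejected.
# The same filter on a hypothetical excerpt:
printf '%s\n' \
  '(II) LoadModule: "nvidia"' \
  '(EE) No devices detected.' \
  '(EE) Screen(s) found, but none have a usable configuration.' \
| grep '(EE)'
```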
It seems I've got it back up working - at least partially.

What I did is:

* remove xbmc-live
* install gdm
* reboot
* remove nvidia driver 195
* install nvidia driver 256
* install xbmc-live (which removes gdm)
* edit /etc/X11/xorg.conf and add module "dri", which enables direct rendering
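For reference, a sketch of the stanza in question in /etc/X11/xorg.conf (your file will contain additional sections, and the "glx" line is usually present already):

```
Section "Module"
    Load "dri"
    Load "glx"
EndSection
```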

DONE

Unfortunately the CPU usage is still >30%, so I only solved the problems I introduced myself. :-P
So my initial problem is still there and makes my head ache. I'm completely puzzled as to what could cause XBMC to use that much CPU power.
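To narrow down where the cycles go, a per-thread CPU breakdown of the process helps (a sketch; point it at xbmc.bin's PID via pidof):

```shell
# On the HTPC you would run:
#   ps -L -o tid,pcpu,comm -p "$(pidof xbmc.bin)"
# which lists every thread of the process with its CPU share.
# Demonstrated against the current shell so the line is runnable anywhere:
ps -L -o tid,pcpu,comm -p $$
```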
Alright, update after reboot: that cheer was quite premature.

The above no longer worked after I restarted the PC. At the point where xbmc-live should have started, the "no screens found" problem came up again. So I reinstalled gdm and reverted to my old configuration, which auto-logs-in the user "tv" and then autostarts XBMC. Before, it worked because apt-get had removed gdm and started xbmc-live. Strangely, the same thing doesn't work on system start, and the X server crashes with the "no screens found" error.

And still, when I switch to "TV-Series" in XBMC, the CPU usage explodes:
Quote:# ps aux | grep xbmc
tv 2695 0.0 0.0 1756 536 ? Ss 15:01 0:00 /bin/sh /usr/bin/xbmc
tv 2762 99.3 2.6 288476 82408 ? Rl 15:02 3:32 /usr/lib/xbmc/xbmc.bin
root 2955 0.0 0.0 3048 808 pts/0 S+ 15:05 0:00 grep xbmc

Is gdm somehow emulating a 3D rendering window for XBMC that is effectively rendered by the CPU?! And why does "xinit xbmc-standalone" work, while on startup it tells me that there are no screens?

Well, when I run it with "xinit xbmc-standalone" it still has high CPU usage:

Quote:# ps aux | grep xbmc
xbmc 4517 0.0 0.0 2584 512 ? Ss 15:22 0:00 /bin/dbus-daemon --fork --print-pid 5 --print-address 7 --session
xbmc 4990 0.0 0.0 2584 512 ? Ss 15:28 0:00 /bin/dbus-daemon --fork --print-pid 5 --print-address 7 --session
tv 5047 0.0 0.0 3060 788 pts/0 S+ 15:29 0:00 xinit xbmc-standalone
tv 5056 0.0 0.1 11104 5644 pts/0 S 15:29 0:00 xterm -geometry +1+1 -n login xbmc-standalone
tv 5057 0.0 0.0 1756 520 pts/2 Ss+ 15:29 0:00 /bin/sh /usr/bin/xbmc-standalone
tv 5061 0.0 0.0 1756 532 pts/2 S+ 15:29 0:00 /bin/sh /usr/bin/xbmc --standalone ""
tv 5070 81.9 4.8 583320 149224 pts/2 Sl+ 15:29 8:13 /usr/lib/xbmc/xbmc.bin --standalone ""

I'm completely puzzled, HELP!
I too have high CPU usage. I thought that was just the way XBMC worked.

Code:
top - 12:57:40 up 1 day,  2:14,  2 users,  load average: 0.95, 0.87, 0.94
Tasks: 138 total,   1 running, 137 sleeping,   0 stopped,   0 zombie
Cpu(s):  8.0%us,  0.8%sy,  0.0%ni, 91.2%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:   1543644k total,  1095344k used,   448300k free,    19344k buffers
Swap:  1333352k total,        0k used,  1333352k free,   887576k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND          
1366 gsg       20   0  309m  84m  30m S   32  5.6  87:58.23 xbmc.bin          
1980 gsg       20   0  8792 2212 1420 S    1  0.1   0:00.44 sshd              
1194 root      20   0 64304  51m 8952 S    1  3.4   0:23.44 Xorg              
1981 gsg       20   0 11932 5992 2420 S    1  0.4   0:00.51 xterm              
2080 gsg       20   0  2476 1168  876 R    0  0.1   0:00.05 top                
    1 root      20   0  2536 1512 1112 S    0  0.1   0:01.36 init
I have about 10% usage in the GUI on an Athlon X2 4000+. XBMC sits in a game loop, continuously rendering the GUI even if nothing has changed, so that explains it.
Is there no other way? I usually implement an event queue and a dispatcher (just like WPF, WinForms and many other frameworks do to save processing power), but I can imagine that migrating would be a really big effort.

Anyway, this game loop is quite exhausting for an HTPC. When an Athlon X2 4000+ already uses >10% of its CPU power just to run the game loop, it's no wonder that an 800 MHz dual core can't keep up with it.

There is also the approach of keeping the game loop but adding logical checks for whether certain code paths need to be processed at all, i.e. one does not have to redraw a static menu item on every frame. Naturally this would look like:

if (do_I_need_this) {  // e.g. a dirty flag on the control
    do_this();         // only redraw when something actually changed
}

This would save power as long as the CPU cycles needed to evaluate "do_I_need_this" are far fewer than the cycles needed for "do_this".

I'm also running an Ion 330 and experiencing the same issue. Sorry for the mere +1 post, but I would definitely appreciate some advice on this as well.
Patches welcome.
The CPU usage was a bit too high (30% on one of the Atom cores) when XBMC was idle on my Revo, causing the fan to become audible. I changed the screensaver from "dim" to "black", and now CPU usage is almost nil (1 or 2%).
I'm having exactly the same issue. I'm running XBMC on a Fusion E-350 on Arch Linux. It runs just fine, however it has a CPU usage of 100% on one core when idle! Something is certainly wrong here.

When I look at the GDB output I see this:
Quote:#0 0x00007ffaa05f5492 in malloc () from /lib/libc.so.6
#1 0x00007ffaa0bb367d in operator new(unsigned long) () from /usr/lib/libstdc++.so.6
#2 0x0000000000d821bd in dbiplus::callback(void*, int, char**, char**) ()
#3 0x00007ffaa479b511 in sqlite3_exec () from /usr/lib/libsqlite3.so.0
#4 0x0000000000d81778 in dbiplus::SqliteDatabase::exists() ()
#5 0x000000000086d412 in CDatabase::Open(DatabaseSettings&) ()
#6 0x00000000006764ba in CGUIInfoManager::GetLibraryBool(int) ()
#7 0x00000000006765f2 in CGUIInfoManager::GetLibraryBool(int) ()
#8 0x0000000000681b42 in CGUIInfoManager::GetBool(int, int, CGUIListItem const*) ()
#9 0x0000000000a9e38d in CGUIControl::UpdateVisibility(CGUIListItem const*) ()
#10 0x0000000000aafc2e in CGUIControlGroup::Render() ()
#11 0x0000000000a9f717 in CGUIControl::DoRender(unsigned int) ()
#12 0x0000000000aafc5b in CGUIControlGroup::Render() ()
#13 0x0000000000a9f717 in CGUIControl::DoRender(unsigned int) ()
#14 0x0000000000aafc5b in CGUIControlGroup::Render() ()
#15 0x0000000000af0fcd in CGUIWindow::Render() ()
#16 0x0000000000afa983 in CGUIWindowManager::Render() ()
#17 0x000000000072e81b in CApplication::RenderNoPresent() ()
#18 0x0000000000722c69 in CApplication::Render() ()
#19 0x00000000009754e1 in CXBApplicationEx::Run() ()
#20 0x0000000000975b10 in main ()

So my guess is a rendering loop that takes 100% CPU load.
I'm going to try the screensaver suggestion to see if that works.
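A cheap way to confirm that guess is the "poor man's profiler": take a handful of one-shot backtraces and count which frames recur. A sketch, assuming gdb is installed; the counting step is shown on a saved sample trace, since attaching requires a live xbmc.bin:

```shell
# Attach once and dump the stack (run as root or as the xbmc user):
#   gdb -batch -ex 'bt' -p "$(pidof xbmc.bin)"
# Repeat a few times; frames that appear in every sample are the hot path.
# Counting render/database frames in a hypothetical saved trace:
printf '%s\n' \
  '#5 CDatabase::Open(DatabaseSettings&) ()' \
  '#16 CGUIWindowManager::Render() ()' \
| grep -cE 'Render|Database'
```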