Kodi Community Forum

Full Version: [RELEASE] Aeon Nox 2.0 (deprecated)
@Choque: Thanks for the good idea. I probed the colorspace capabilities of the Intel driver and got "RGB 4:4:4" and "YCrCb 4:4:4". Promising. Unfortunately, forcing YCrCb 4:4:4 via xorg.conf made zero visible difference. It's possible that it's not properly implemented in the Intel driver, and is either not working at all or not doing any compression (not rescaling so that 0 becomes 16, and so on). I'll talk to the Intel guy in a few weeks. Thanks anyway for the suggestion.

@SpectreX: Yeah, nVidia is at the top of the game on Linux; they support fractional framerates for judder-free playback, multichannel HDMI audio output, and 16-235 output. They also have fewer bugs. However, I'll see what I can do with my contact at Intel, because adding a standalone graphics card is not an option (they *really* draw loads of electricity**). I already got him working on fixing the multichannel HDMI audio that's built into the Sandy Bridge motherboards, so that Linux users will soon be able to enjoy uncompressed multichannel audio over the same wire as the video. Currently, there's a bug in its EDID->ELD parsing, which means it doesn't expose the capabilities and defaults to exposing a "2 speaker setup". This means Intel users only see "Digital Stereo (HDMI) Output", which has forced them to find other output solutions for their multichannel audio. When this is fixed, it will properly enumerate all speakers and support up to 8 channels of audio over the HDMI port. I'll see if he can add 16-235 output as well; that should be easy.

** An -idle- nVidia card draws about 3x more electricity than my entire system combined (that means more power than the CPU, all four hard disks, the motherboard, all fans, etc, all added together). Under load it draws about 4.5x more than my system's power draw. Therefore, for low-electricity 24/7 HTPCs, it doesn't make sense to go beyond the built-in Intel HD2000/3000. After the serious bugs are worked out, Intel will be a great choice. The only thing we probably will not be able to add is fractional framerate output, due to hardware limitations that cannot be overcome. However, that will be the last remaining benefit that nVidia will offer over the integrated graphics (well, apart from better graphics performance, but that's not needed for an HTPC, and the HD3000 is plenty powerful for that too anyway).
Isn't calibration supposed to correct everything, no matter what the output colourspace is? So even if the output is wrong (0-255), you correct it by calibrating it to 16-235.
Btw, I'm running Linux on my HTPC and I do recall having to recalibrate when moving from Windows.
@SpectreX: Oh, and yes, the projector I use expects 16-235 input and not 0-255 (which is a normal decision for video display devices, since *all* movies, videos, consoles, etc. are 16-235, aka "video levels", and the *only* thing that uses the extended "PC levels" range is computers). Basically, the EDID informs the graphics card about supported ranges (0-255, 16-235, or both), and your Windows driver will auto-select the appropriate one. In my case, the problem occurs because my projector requires 16-235 input and the Intel Linux driver only outputs 0-255. Yet another oversight in the Intel driver.

@Big_Noid: It's not possible to reclaim lost shadow detail/crushed blacks by raising brightness on the projector, since anything below 16 or above 235 is already clipped and scaled during the input stage. One solution would be to apply a global "gamma" adjustment in XBMC to raise the overall brightness, but that's not supported for the GUI, only for the video player, and only by writing some extra code into the OpenGL rendering shader to re-scale the video layer to 16-235. That is precisely what some people do, though.
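For concreteness, the re-scaling such a shader (or a driver-side 16-235 output mode) performs is just a linear remap of each pixel component. Here's a minimal sketch in Python for illustration only; a real implementation would be a few lines of GLSL in the render shader:

```python
def full_to_video_levels(value):
    """Map a full-range (0-255) component into video levels (16-235).

    Black (0) becomes 16 and white (255) becomes 235; everything in
    between is scaled linearly. Values already outside 0-255 would
    need clamping first, omitted here for brevity.
    """
    return round(16 + value * (235 - 16) / 255)

print(full_to_video_levels(0))    # 16
print(full_to_video_levels(255))  # 235
```

Note this only stops *future* clipping; as described above, detail that was already clipped at the input stage cannot be recovered by any remap.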

However, the proper solution is to do what nVidia did, and allow a 16-235 output mode so that everything on the screen will be properly re-adjusted to the right output range by the graphics card. It should be fairly easy to add to the driver (it already exists in the drivers for the older pre-HD2000/HD3000 range), and it'd be a boon to HTPC owners. Currently, the conventional wisdom is to stick with nVidia for Linux HTPCs, but we're working through the most serious bugs, and Intel is about to catch up and fix its biggest shortcomings pretty soon. Smile That's great news for HTPC owners, due to the *far* lower power draw and nonexistent fan noise of integrated graphics.
hey quick question... i'm *loving* this theme by the way. is there a way to increase the plot font size in the boxes under movies and tv? it's a little small for my bad eyesight.

thanks! i did a search and came up with nada.
Think I have found a bug. Maybe it belongs to Aeon Nox or maybe to XBMC itself.

The problem occurs after I have used "video calibration" to exactly match the screen to the chosen resolution.

During video playback, if I rewind or fast-forward the video, the seek bars at both the bottom and the top barely show, because they are almost outside the screen...

Can I adjust this somehow? I haven't found an easy way to do it, because it doesn't follow the subtitle adjustments or the general screen boundary adjustments...
Charlie97L Wrote:hey quick question... i'm *loving* this theme by the way. is there a way to increase the plot font size in the boxes under movies and tv? it's a little small for my bad eyesight.

thanks! i did a search and came up with nada.

There's no in-skin configuration of font sizes, so you will have to edit the skin files. I'm afraid there is no other solution. It's entirely doable but requires some tech skills. Here are some hints for anyone that wants to attempt this:

I am not sure what you mean by "the boxes under movies and tv". Do you mean the "last added movies/tv episodes" boxes that you can enable in the Skin configuration, that show a list of newly added/random movies and episodes when you hover over your Movies and TV Shows main menu entries?

Since I don't know which boxes you mean, I can't help with the exact view it relates to, but the view definitions are under your XBMC user data folder > addons > skin.aeon.nox > 720p. That folder contains the XML files for every UI element.

Those UI files, in turn, contain references to fonts. You'll see lines such as "<font>Font_Reg15</font>" littered throughout those files. The valid font names are all defined in Font.xml.

So: determine which "boxes" you're referring to, and find the relevant view file for them (if it's one of the information boxes I described on the Home screen, you will find what you seek in Includes_HomeWidgets.xml, I think under the section called "LatestPosterWidget"). Then look for <font> tags in the relevant section of the file, and change the appropriate ones (the title of each movie/episode entry, for instance) to a bigger size. Make sure that the new font name/size you choose is valid (it has to exist in Font.xml). Then restart XBMC to see the change.

There's one final caution: Choosing too big of a font size will mess up the design, but of course you can easily step down your changes a bit.
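For anyone attempting the above, here's a hedged sketch of how you could list the valid font names before editing. The Font.xml fragment below is a simplified, hypothetical example of the structure; check the actual file in the skin's 720p folder:

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical example of a skin's Font.xml structure;
# the real file in skin.aeon.nox/720p will have many more entries.
SAMPLE_FONT_XML = """
<fonts>
  <fontset id="Default">
    <font>
      <name>Font_Reg15</name>
      <filename>Roboto-Regular.ttf</filename>
      <size>15</size>
    </font>
    <font>
      <name>Font_Reg18</name>
      <filename>Roboto-Regular.ttf</filename>
      <size>18</size>
    </font>
  </fontset>
</fonts>
"""

def list_font_names(xml_text):
    """Return every <name> defined for a <font> in a Font.xml document."""
    root = ET.fromstring(xml_text)
    return [f.findtext("name") for f in root.iter("font")]

print(list_font_names(SAMPLE_FONT_XML))  # ['Font_Reg15', 'Font_Reg18']
```

Any name in that list is safe to drop into a `<font>...</font>` tag in the view files; anything else will fall back to a default and look wrong.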
When I enable "Show video info when player is paused" I get no OSD when I pause the video.
Big Noid:

Here's that debug log.

Platform: ATV2
Version: 4.3
install method for XBMC - SSH'd using instructions from this post
Build Version: XBMC Pre-11 Git 20110515-8b3a9fc


Have reinstalled the nightly several times and have been unable to get your skin loaded. I keep receiving the error message "dependencies not met" when attempting to install from Zip File.

I don't know how to embed a debug file, but I think I found the info you need...

Quote:
13:07:09 T:185208832 M:122707968 DEBUG: GetZipList - Processing zip://%2fvar%2fmobile%2fBigNoid%2dAeon%2dNox%2d0ac9acc%2ezip/
13:07:13 T:185208832 M:122318848 DEBUG: Addon skin.aeon.nox requires xbmc.gui version 3.0.0 which is not available
13:07:13 T:185208832 M:122318848 INFO: Loading skin file: DialogKaiToast.xml
13:07:13 T:185208832 M:122318848 DEBUG: ------ Window Init (DialogKaiToast.xml) ------
13:07:13 T:185208832 M:122318848 INFO: Texture bundle has changed, reloading
13:07:13 T:185208832 M:122318848 DEBUG: Cleanup - Closed bundle
13:07:13 T:185208832 M:122318848 INFO: Texture bundle has changed, reloading
13:07:13 T:185208832 M:122318848 DEBUG: Cleanup - Closed bundle
13:07:13 T:185208832 M:122318848 ERROR: Texture manager unable to open file /var/mobile/BigNoid-Aeon-Nox-0ac9acc.zip/icon.png
AeonNoxFan Wrote:** An -idle- nVidia card draws about 3x more electricity than my entire system combined (that means more power than the CPU, all four hard disks, the motherboard, all fans, etc, all added together). Under load it draws about 4.5x more than my system's power draw. Therefore, for low-electricity 24/7 HTPCs, it doesn't make sense to go beyond the built-in Intel HD2000/3000. After the serious bugs are worked out, Intel will be a great choice. The only thing we probably will not be able to add is fractional framerate output, due to hardware limitations that cannot be overcome. However, that will be the last remaining benefit that nVidia will offer over the integrated graphics (well, apart from better graphics performance, but that's not needed for an HTPC, and the HD3000 is plenty powerful for that too anyway).

I have an nVidia-based ION machine that draws less than 200w total and cost under £200. There's really no need to get a standalone nVidia card, as this does the job more than well enough for a dedicated HTPC.

Incidentally, I use Intel graphics in my laptop and ATI at work, and Nox looks fine on all of them to my eyes! :p
damned Wrote:Big Noid:

Here's that debug log.

Platform: ATV2
Version: 4.3
install method for XBMC - SSH'd using instructions from this post
Build Version: XBMC Pre-11 Git 20110515-8b3a9fc


Have reinstalled the nightly several times and have been unable to get your skin loaded. I keep receiving the error message "dependencies not met" when attempting to install from Zip File.

I don't know how to embed a debug file , but I think I found the info you need. . .
I think the problem is that git names the folders differently. Try the following: unpack the zip first on your PC, rename the folder to skin.aeon.nox, and repack it as skin.aeon.nox.zip. Then SSH to the ATV and install from zip. That should work.
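If you'd rather script the unpack/rename/repack than do it by hand, here's a minimal sketch. It assumes the zip has a single git-named top-level folder, as in the log above; the paths in the usage comment are hypothetical, so adjust them to wherever you downloaded the file:

```python
import zipfile

def repack_for_xbmc(src_path, dst_path, addon_id="skin.aeon.nox"):
    """Rewrite a git-exported zip so its top-level folder matches the addon id.

    Git archives name the top folder after the commit (e.g.
    BigNoid-Aeon-Nox-0ac9acc/); XBMC expects it to equal the addon id.
    """
    with zipfile.ZipFile(src_path) as src, \
         zipfile.ZipFile(dst_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            # Swap the auto-generated top folder for the addon id,
            # keeping the rest of each entry's path intact.
            _old_top, _, rest = info.filename.partition("/")
            dst.writestr(addon_id + "/" + rest, src.read(info))

# Usage (hypothetical local paths):
# repack_for_xbmc("BigNoid-Aeon-Nox-0ac9acc.zip", "skin.aeon.nox.zip")
```

Then copy the resulting skin.aeon.nox.zip over to the ATV and install from zip as before.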
drivesoslow Wrote:When I enable "Show video info when player is paused" I get no OSD when I pause the video.

Yeah, it only takes effect for the currently playing video, so the next video you play (or if you stop and restart the same one) it will work. But I'll fix that, thx.
Big_Noid Wrote:Are you talking about pseudotv?

No, I will illustrate it:



Direct link: http://imageshack.us/photo/my-images/854/aeonnox.png/


Direct link: http://imageshack.us/photo/my-images/546...uence.png/

As you can see, in Customizable Confluence the TV/studio logo shows; this is because I made my own and added them to a folder in the skin addon folder. This is the path on a Windows machine:

%APPDATA%\XBMC\addons\skin.customizableconfluence\media\flagging\studios

So I'm asking: is there an equivalent option in your skin to add homemade studio logos, or where do the logos get fetched from?
@Vinther: Yes, the path is skin.aeon.nox/media/flags/studios
If you create these folders next to the textures.xbt, the skin will pick them up.
Please consider adding your studio flags in this thread, so more skins can use them:
http://forum.xbmc.org/showthread.php?tid=99554
Mindzai Wrote:I have an nVidia-based ION machine that draws less than 200w total and cost under £200. There's really no need to get a standalone nVidia card, as this does the job more than well enough for a dedicated HTPC.

Incidentally, I use Intel graphics in my laptop and ATI at work, and Nox looks fine on all of them to my eyes! :p

I hope 200W was a typo, and you meant 20W....Big Grin
SpectreX Wrote:I hope 200W was a typo, and you meant 20W....Big Grin

Yes, my mistake; it's actually got a 65W PSU, so 20W is probably closer to the truth!