[RELEASE] Texture Cache Maintenance utility
(2014-04-29, 17:41)rbusenet Wrote: I got that working but was wondering how you have yours set up.

I have a NAS (FreeNAS, exporting NFS, running MySQL); a headless Raspbian-based Raspberry Pi(*) on which I run Sickbeard/Transmission and other scheduled tasks, including all the scripts that handle XBMC scraping; an XBMC "scraper" client (an OpenELEC Raspberry Pi, the "scraper" because it's always on); and an XBMC non-scraper client (OpenELEC x86).

*This could have been the NAS instead of a Raspbian Raspberry Pi, but I prefer to keep the NAS "as a NAS" as much as possible and I'd have put MySQL on the Pi too if it had the grunt!

This is pretty much my approach:

TV Shows:
These all come via Sickbeard running on the Raspbian Raspberry Pi. I have a Sickbeard post-processing script that is called whenever Sickbeard copies a new episode to my NAS, and this post-processing script is passed the file name of the new episode's media file (mkv, avi etc.). The post-processing script uses mediainfo to add streamdetails to the corresponding NFO for the new media file, and then the following actions are performed:
  1. a library scan is performed on the "scraper" client (Raspberry Pi running OpenELEC). Only the season folder is scanned (vscan "nfs://path/to/tvshow/Season XX"), not the whole library
  2. mklocal.py is used to convert any remote artwork to local (unlikely there would be any with my tv show setup, but just in case...) and also load any new non-standard local artwork (eg. clearlogo, clearart, etc.)
  3. pre-load texture cache (c tvshows) on the scraper client if any artwork is converted or new artwork loaded by mklocal.py
  4. runs "qax tvshows @qaperiod=90" to reload any episodes that fail QA (eg. an episode thumb that wasn't available yesterday but has since been downloaded)
  5. wakes any sleeping remote non-scraper clients, runs "c tvshows" (or "lc tvshows" if mklocal.py made no changes) against each non-scraper client, then puts each client back to sleep if it was sleeping originally
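The five TV-show steps above can be sketched as a small wrapper around texturecache.py. This is only a dry-run sketch, not the author's actual script: the host names and client list are placeholder assumptions, and the wake/sleep handling and mklocal.py invocation (whose arguments depend entirely on the local artwork layout) are omitted.

```shell
#!/bin/sh
# Sketch of the Sickbeard post-processing flow described above.
# RUN=echo keeps this a dry run that only prints the commands;
# set RUN= (empty) to execute them for real.
RUN=${RUN:-echo}
SCRAPER=scraper-pi        # placeholder: always-on OpenELEC Pi ("scraper")
CLIENTS="openelec-x86"    # placeholder: non-scraper clients, space-separated

process_episode() {
  # Sickbeard passes the new episode's file name; this sketch assumes it
  # matches the library's nfs:// source, so its directory is the season folder
  season_dir=$(dirname "$1")

  # 1. scan only the season folder on the scraper client
  $RUN texturecache.py vscan "$season_dir" @xbmc.host="$SCRAPER"

  # 2. mklocal.py would run here to localise any remote artwork and pick up
  #    non-standard local artwork (clearlogo, clearart, etc.)

  # 3. pre-load the texture cache on the scraper client
  $RUN texturecache.py c tvshows @xbmc.host="$SCRAPER"

  # 4. reload any episodes that previously failed QA
  $RUN texturecache.py qax tvshows @qaperiod=90 @xbmc.host="$SCRAPER"

  # 5. refresh the cache on each non-scraper client
  for client in $CLIENTS; do
    $RUN texturecache.py c tvshows @xbmc.host="$client"
  done
}
```

Calling `process_episode "nfs://nas/tvshows/Some Show/Season 01/episode.mkv"` with RUN=echo prints the commands in order, which is a handy way to check the sequence before wiring it into Sickbeard.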

Movies:
These I rip myself, copy to the NAS along with artwork, then create the NFO file using Ember. A daemon (started at boot, running on the Raspbian Pi, waking at 5-minute intervals) uses "find" to locate any new movies, and only when all of the new movies have a matching NFO are the following actions performed:
  1. a library scan is run on the scraper client, performing a whole library scan (vscan - yes, this could inadvertently pick up a new tv show episode but given the infrequency with which movies are added this isn't usually a problem)
  2. after the scan, Artwork Downloader is executed (exec script.artwork.downloader silent=true mediatype=movie). As I store multiple movies per folder (ie. I use the movie-name prefix rather than one movie per folder) "Use local files" is not enabled and new remote artwork is associated with movies. The script waits for AD to finish by tailing xbmc.log. AD is configured to find only clearart and clearlogo artwork on http://fanart.tv.
  3. mklocal.py is used to convert any remote artwork to local (ie. any new artwork found by AD), and also load any other new non-standard local artwork (unlikely there would be any, but just in case...)
  4. pre-load texture cache ("c movies") on the scraper client if any artwork is converted or new artwork loaded by mklocal.py
  5. wakes any sleeping remote non-scraper clients, runs "c movies" (or "lc movies" if mklocal.py made no changes) against each non-scraper client, then puts each client back to sleep if it was sleeping originally
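Sketched the same way, the movie sequence the daemon runs looks like the following. Again the host names are placeholder assumptions, and the mklocal.py step and the xbmc.log tailing that waits for Artwork Downloader are omitted for brevity.

```shell
#!/bin/sh
# Sketch of the movie workflow described above, run by the 5-minute daemon
# once every new movie has a matching NFO. RUN=echo keeps this a dry run.
RUN=${RUN:-echo}
SCRAPER=scraper-pi        # placeholder: always-on scraper client
CLIENTS="openelec-x86"    # placeholder: non-scraper clients

scan_new_movies() {
  # 1. whole-library video scan on the scraper client
  $RUN texturecache.py vscan @xbmc.host="$SCRAPER"

  # 2. run Artwork Downloader silently for movies; the real script then
  #    waits for it to finish by tailing xbmc.log (not reproduced here)
  $RUN texturecache.py exec script.artwork.downloader silent=true mediatype=movie @xbmc.host="$SCRAPER"

  # 3. mklocal.py would localise any new remote artwork found by AD here

  # 4. pre-load the texture cache on the scraper client
  $RUN texturecache.py c movies @xbmc.host="$SCRAPER"

  # 5. refresh the cache on each non-scraper client
  for client in $CLIENTS; do
    $RUN texturecache.py c movies @xbmc.host="$client"
  done
}
```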

So my workflow for TV shows is... precisely nothing; everything works automatically. Shows are loaded as they become available, artwork is updated and backfilled, and all caches are pre-loaded.

For movies, I copy the new movie file(s) and associated fanart/poster artwork to my NAS after ripping, scrape each new movie into Ember (which creates the NFO file, with streamdetails etc., on the NAS), and once all the movies have their NFO files the scraping process will start automatically on the "scraper" client within 5 minutes. Once the database optimisations have landed I will consider adding a call to update IMDb movie rating/vote information on a more regular basis; the current JSON/database update performance doesn't really make this feasible right now.

(2014-04-29, 17:41)rbusenet Wrote: I can see that a cron job isn't exactly the most efficient setup for this script. Ideally the scraper xbmc should trigger the client xbmc to start the script when something new has been scraped.

There is an addon, I think (I can't recall the name; see [1] and [2] below), that allows you to execute external shell scripts. One solution might be to call a shell script from within XBMC (you'll probably want to fork a new process within the shell script, as the work can take a while to finish). The shell script can then perform all or some of the above steps; that is, the shell script initiates the library scan and performs all the other steps, rather than XBMC starting the scan and then somehow calling the script to complete the remaining steps.
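A minimal sketch of the forking idea: the wrapper the addon calls returns immediately, while a detached background process does the slow library work. The worker path here is a placeholder, not a real script.

```shell
#!/bin/sh
# Hypothetical wrapper an XBMC addon could invoke. It forks the real work
# (scan, mklocal.py, cache pre-load) into a detached background process and
# returns at once, so XBMC is not blocked while the work runs.
launch_worker() {
  worker=${WORKER:-/storage/scripts/library-update.sh}  # placeholder path

  # nohup plus & detaches the worker; redirecting its output lets the
  # calling shell close its descriptors and return to XBMC immediately
  nohup "$worker" "$@" >/tmp/library-update.log 2>&1 &
}
```

The addon only needs to call this wrapper; the worker script then carries out the scan and cache steps described earlier at its own pace.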

Maybe the following addons will be of use - not sure if they're Gotham compatible:
1: http://forum.xbmc.org/showthread.php?tid=85724
2: http://forum.xbmc.org/showthread.php?tid=151011
RE: [RELEASE] Texture Cache Maintenance utility - by Milhouse - 2014-04-29, 20:42