Attention: Thumbnail cache rewrite's unintended consequences
#16
We do agree that the static list is suboptimal, and we do plan to address it - it's just not part of this iteration. We have GSoC projects going on that will rewrite the scraper system, so there's no sense wasting work on doing this within the current system when it would have to be redone in half a year's time.
#17
@spiff: Yeah, the image URL list being static is just one of the issues, though; the bigger issue is how local cache files are stored using a flawed naming scheme, and that's an improvement that can be made now, before history repeats itself and this happens again (i.e. when themoviedb moves to a different CDN URL format and suddenly all crc32(url) mappings are broken once more).

Anyway, could someone please briefly explain in which database the per-video/per-movie/per-series choices for thumbnail and fan art are stored? I thought it was MyVideos.db, but when I look at the c08 and c20 columns all I see are huge lists, so that doesn't seem to be where the choice is kept.

Perhaps it was stored in Textures.db? I'm looking for the place where the active image choices are stored.

Maybe Martijn could tell me how it's laid out?

I am in the middle of writing a Python script that scans through the appropriate db looking for invalid URLs; when it finds one, it looks up the old hashed file and copies it to a temp folder under a readable name, e.g. "American History X (Fanart).jpg". That way you could run it on your database and have it rescue the thumbnails and fan art for *just* the library items that need rescuing, with a human-readable result. I don't know how far I'll get. Ask me to code anything in Assembler and I'll do it, but this is my first time touching Python. I'm experienced in most other big scripting languages but never had a reason to use the snake. (That's what she said)

Edit: Okay, so far my own testing of the old scheme has revealed that the thumbnail hash comes from a CRC of the path, such as /storage/media/Moviename/. Now I just need to know how the fan art hash is determined. I can see that the NEW format stores the fan art in the "art" table, but where did the PREVIOUS database format store it?
#18
http://wiki.xbmc.org/index.php?title=XBMC_databases
http://wiki.xbmc.org/index.php?title=ThumbnailCache
http://wiki.xbmc.org/index.php?title=Thumbnails

None of those actually tells me where the *selected fan art* choice was stored in the old database, and I sure can't find it. All I've managed to find are the XML dumps containing the full lists of available thumbs/fan art online.

Maybe fan art was stored as a linked value between MyVideos.db and Textures.db?

Gotta put the scripting on hold until I know... Heck, Martijn, you know all of this and Python; if you want to give it a go, the idea I had was:

* Load old (previous format) database.
* Loop through tv shows and movies.
* Do a HEAD request for every selected thumb and fan art URL to see if the file still exists on the server (a quick sketch of such a check is below this list).
* If the file no longer exists on the server, take the LOCALLY CACHED OLD-STYLE file and COPY it to a temp folder with a nice name like "American History X (Thumb).tbn", and do that for every movie and tv show in the database. The result is an output folder of all the orphaned files, which can then be casually gone through to re-instate the art, either via local file browsing or via the latest online URLs.
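For the HEAD-request step, something along these lines would do it with just the Python standard library. It's only a sketch of the idea: url_exists() is a made-up helper name, and the timeout/status handling is a starting point rather than anything XBMC itself ships.
Code:
#!/usr/bin/env python

import httplib
import urlparse

def url_exists(url, timeout=10):
    # Issue a HEAD request and report whether the server still has the file.
    parts = urlparse.urlparse(url)
    conn_class = httplib.HTTPSConnection if parts.scheme == "https" else httplib.HTTPConnection
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    try:
        conn = conn_class(parts.netloc, timeout=timeout)
        conn.request("HEAD", path)
        status = conn.getresponse().status
        conn.close()
        # Redirects still count as "present"; only 4xx/5xx mean the art is gone.
        return status < 400
    except (httplib.HTTPException, IOError):
        # Network trouble: report the URL as missing and let the caller decide.
        return False

# Example: print url_exists("http://somesite.com/someimage.jpg")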
#19
1. themoviedb and thetvdb.com URLs are pretty stable now according to their owners, but...

2. we don't care if they're not, as we never re-query http:// URLs anyway once cached (unlike local sources - and of course local source URLs are much more stable; they have to be, otherwise the media is going to disappear anyway).

3. the hash is used only for the local filename, for nothing else whatsoever. It's defined entirely at the texture-cache level, and I could have used incrementing numbers, random numbers or whatever I wanted, as it's stored in the database. Yes, the current method assumes that the image from URL x differs from the image from URL y **. This is almost certainly the case anyway, but even if it wasn't, all it gives is duplicate images in the cache, so it really doesn't matter (it's much more likely that different media URLs would use the same art URLs, and with usage tracking we'll quickly discover old art that is no longer used and clean it up). We could certainly generate a hash of the file as it's being cached and use that to redirect as needed if we were concerned about dupes in the cache - I haven't yet seen a need for that.

4. we store the original URL in the videodb so that multiple clients can cache from the same database. Storing a hash here doesn't make sense, as the texture cache is local and so cannot be queried. At least with the original URL we have a hope of multiple clients being able to get the image.

5. given that the existing static URLs in the database are wrong, the only reasonable fix here is first and foremost to update them, and secondly to replace them with something that is always correct (i.e. a method of finding the correct URLs for art regardless of updates to the sites in question - essentially looking up based on an ID, for example). The latter has not yet been done - I doubt it'll be done by the time Frodo is out, and even if it were, it may require a rescrape, as I don't think we have the required ID stored for this to work well anyway (though we may be able to manage if we make enough assumptions). The plan is to ensure we have that ID available by the time Frodo is out so that further rescraping is minimised. Thus, rescraping now seems reasonable enough where it's required - the only way to fix bad data is to fetch good data. Indeed, we don't actually require a full rescrape, as Martijn is planning to take care of it with a script.

6. If you wish to know the original hashing technique, see the commit series that removed it - in particular the history of ThumbnailCache.cpp. For movie posters and fanart you have things pretty much correct (lowercase of the path). For other things there are more interesting hashing behaviours - see season art, for example. Remember that this does not solve the problem - all it does is recover the art; it doesn't fix the URLs in the database. It might work out fine for you.

7. Note that if all you want to do is export your old art and then find it again, you have a number of simple ways to do this as long as you have an Eden install. For example, you could export your library to separate files (art alongside media) and then rescan with Frodo. Or, you could export from Eden to a single file/folder structure, and then you have all your art exported to find again should you need to with Frodo. Neither fixes the issue with bad data in the database, and the former makes that bad data persist by dropping it into .nfo files.

8. For fanart, the selected fanart is the topmost, assuming it came from online. If the user has since changed it to a local image, we have no knowledge of this in the old db. For thumbs, we have no knowledge whatsoever where the art came from. Assuming non-local, the best guess is it's the first one, but if the user has changed it to another online thumb we have no knowledge of that.

Cheers,
Jonathan

** We also happen to store a hash for local images (computed using ctime/mtime + image size, for speed), but of course that isn't unique enough - we use it only for determining whether a local image may have changed and, if so, updating it automatically.
#20
Thank you so much for the detailed explanation, Jonathan. I can see why the system was designed this way, but I'm glad to hear you're looking into various ways of improving it so that this situation is not repeated. There really should be some safeguard that prevents "dead URL = local file no longer valid" from ever being a possibility (the remote URL shouldn't matter once the data is safely on disk). Then again, maybe this won't repeat itself once people are fully migrated to the new Texture database format, since even if a URL dies, the database will still contain the cached URL-to-local-file mapping and continue using the same image.

I am especially grateful for the pointer to the right source files. It allowed me to understand the old hashing method, and I've hastily written a small export tool (I didn't want to bother re-installing an older XBMC version and using its export, since that, afaik, doesn't give human-readable .tbn files either).

It was one of those "okay, I don't know Python and I don't really care about doing this perfectly, but I'll construct it as I go" things, but it works. It queries the latest database for all movies, grabs their path and filename data, applies the old hashing logic (supporting both single-file and stacked-file movies) and copies the cover + fanart (where they exist) to nicely named files in an output folder, where they'll be easy to look up.

Example:
The Godfather: Part II would be output as "The Godfather Part II_cover.tbn" and "The Godfather Part II_fanart.tbn"

I didn't implement all the other ideas (such as doing an HTTP HEAD request to check whether an image has gone missing online) because I couldn't find where the *web* URL of the currently selected cover/fanart is stored in the database, and because what I've got now is good enough.

It prints status along the way, such as:
[One Flew Over the Cuckoo's Nest] COV:True FAN:True
[24 Hour Party People] COV:True FAN:True
[Glengarry Glen Ross] COV:True FAN:True

The output-name sanitizer (which is basically just a regexp I wrote) supports unicode such as до свидания, so it's safe to run on foreign movie titles as well.
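For illustration, this is the same pattern the script below compiles, applied on its own to the example titles from this post (expected results shown as comments):
Code:
# -*- coding: utf-8 -*-
import re

# Strip anything that is not a word character or a space, Unicode-aware.
pattern = re.compile(r"[^\w ]+", re.UNICODE)

print pattern.sub("", u"The Godfather: Part II")          # The Godfather Part II
print pattern.sub("", u"до свидания").encode("utf-8")     # до свидания (left intact)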

Anyway, the basic script idea (which is how I used it):
* Edit the script to set your proper user data and output folders
* Execute it to export all found movie covers/fanart (it only deals with the movie library)
* Launch the latest version of XBMC and scroll through the movies
* Anything with a missing cover or fan art: do a Refresh from the menu to grab the latest URLs, look at the exported thumb and fan art to see which one was used earlier, and pick the same one. If the fan art was custom-imported from the local disk, just point XBMC directly at the exported .tbn file to re-import the previous image. Alternatively, you could skip the refresh altogether and simply import the old thumbnails from disk.
* After a little bit of work (but far less than without this script) you'll have re-instated the exact covers/thumbnails you lost.

It's just a very cheap method (due to me not knowing the ins and outs of the XBMC database structure), but it works. I would have liked to implement the proper check - "HEAD request: is this URL still valid? Yes? Do nothing, as XBMC will find it again. No? Okay, export the old cover/fanart." - so that it only exports the REQUIRED ("orphaned") covers/fanart, and also to support more than just the movie library, but I honestly haven't got the time to learn the remaining pieces of the database to do those things.

I still hope someone will get use out of this script in the future, or even adapt it to do more of the intended features.

Variables/strings to change:
xbmcuserdata = your own userdata path, with trailing slash (used to locate the database and thumbnails)
outputpath = desired output folder, with trailing slash, must have write permissions
"MyVideos64.db" = can be changed to lower database version, doesn't matter; use the highest you have
Code:
#!/usr/bin/env python

import sqlite3 as lite
import re
import os.path
import shutil
import sys

def get_crc32( string ):
    # XBMC's thumbnail hash: CRC32 (polynomial 0x04C11DB7) of the lowercased string
    string = string.lower()
    bytes = bytearray(string.encode())
    crc = 0xffffffff
    for b in bytes:
        crc = crc ^ (b << 24)
        for i in range(8):
            if (crc & 0x80000000):
                crc = (crc << 1) ^ 0x04C11DB7
            else:
                crc = crc << 1
        crc = crc & 0xFFFFFFFF

    return '%08x' % crc

def get_thumbnails( moviename, moviepath, moviefiles ):
    # Old (pre-Frodo) scheme: the cover hash comes from the folder path, the
    # fanart hash from the full file path (or the stack:// URL for stacked movies)
    coverid = get_crc32(moviepath)
    coverpath = "Video/%s/%s.tbn" % (coverid[:1], coverid)
    fanartid = get_crc32(moviefiles if moviefiles[:8] == "stack://" else moviepath + moviefiles)
    fanartpath = "Video/Fanart/%s.tbn" % (fanartid)
    return (coverpath, fanartpath)

def copy_thumb( userdatapath, tbn, outputpath ):
    # Copy a cached thumbnail/fanart to the export folder; skip files already exported
    thumbpath = userdatapath + "Thumbnails/" + tbn
    success = False
    if (os.path.isfile(thumbpath)):
        if (os.path.isfile(outputpath)):
            success = True
        else:
            try:
                shutil.copyfile(thumbpath, outputpath)
            except IOError, e:
                print "Unable to copy file: %s" % e
                sys.exit(1)
            else:
                success = True
    return success

# both paths must be pre-existing and the output path needs write permissions!
xbmcuserdata = "/home/mediacenter/.xbmc/userdata/"
outputpath = "/storage/incoming/xbmcexport/"
con = None

try:
    # note: it's fine to use the latest database version, as it contains all required info as well
    con = lite.connect(xbmcuserdata + "Database/MyVideos64.db")
    
    cur = con.cursor()
    # c00 = movie title, c22 = folder path; join through path/files to get the filename(s)
    cur.execute("SELECT movie.c00, movie.c22, files.strFilename FROM movie JOIN path ON path.strPath=movie.c22 JOIN files ON files.idPath=path.idPath")
    
    data = cur.fetchall()
    
    # strip everything that isn't a word character or a space from the output filename
    pattern = re.compile("[^\w ]+", re.UNICODE)
    for (moviename, moviepath, moviefiles) in data:
        paths = get_thumbnails( moviename, moviepath, moviefiles )
        
        sys.stdout.write("[" + moviename + "] ")
        safemoviename = pattern.sub("", moviename)
        coversuccess = copy_thumb( xbmcuserdata, paths[0], outputpath + safemoviename + "_cover.tbn" )
        fanartsuccess = copy_thumb( xbmcuserdata, paths[1], outputpath + safemoviename + "_fanart.tbn" )
        print "COV:" + str(coversuccess) + " FAN:" + str(fanartsuccess)

except lite.Error, e:
    print "Error: %s" % e.args[0]
    sys.exit(1)

finally:
    if con:
        con.close()

I can also understand why you didn't want to bloat the XBMC source by replicating the old hashing method (since it was pretty complex, with many different cases for different library types) just to do an in-place migration of broken URLs.

However, here's an idea I just had: if one knew where the currently chosen cover/fanart image URLs were stored in the database, one could verify whether those files still exist online, and if not:
* Run the legacy hashing code on the paths + files to get the old local files for cover + fan art (such as "Video/b/b1ae7b4a.tbn").
* Once you know the old local file, compute crc32(oldBrokenURL) to get the new URL-based hash for the broken URL.
* Next, copy the file, e.g. copy("Video/b/b1ae7b4a.tbn", "<newhashbasedonURLinstead>.tbn") into the appropriate new cache subfolder.
* Insert an entry in the Textures database pointing the old broken URL at the hashed version of the old broken URL.

Voila, in-place upgrades of the locally cached versions of old broken URLs.
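Just to make the idea concrete, here's a very rough sketch of that migration step, reusing the get_crc32() helper from the export script above. Everything here is an assumption for illustration: migrate_cached_art() is a made-up name, the one-character subfolder layout of the new cache mirrors the bullets above rather than verified behaviour, and the Textures.db insert is deliberately left as a comment because the exact table layout should be checked against a real install.
Code:
import os
import shutil

def migrate_cached_art(userdatapath, old_local_tbn, broken_url):
    # old_local_tbn: the old path-hash cache file, e.g. "Video/b/b1ae7b4a.tbn"
    # broken_url:    the dead online URL the videodb still references
    src = os.path.join(userdatapath, "Thumbnails", old_local_tbn)
    if not os.path.isfile(src):
        return None
    newhash = get_crc32(broken_url)  # the new scheme hashes the image URL itself
    dst = os.path.join(userdatapath, "Thumbnails", newhash[:1], newhash + ".tbn")
    dstdir = os.path.dirname(dst)
    if not os.path.isdir(dstdir):
        os.makedirs(dstdir)
    shutil.copyfile(src, dst)
    # Final step (not shown): insert a row into Textures.db mapping broken_url
    # to this cached file so the new cache treats it as already downloaded.
    return dst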

A script like this would be worthwhile because there will be a swarm of people going "OMG, the new XBMC sucks, half the images in my 5000-movie library are missing".

It could even be run automatically ONCE and then delete itself.

The script would basically replicate the exact same thing that can happen in the future anyway:
* The CURRENT (valid) URLs can ALSO become invalid someday, leaving the user with invalidURL->cachedThumb mappings in Textures.db.
* So why not import the old, now-dead URLs in the same way; it doesn't matter anyway. All URLs are bound to break some day, and an import like this would help users.

Of course, the best method of all (as you say) would be not to store direct URLs at all, but instead some form of siteName+movieID+fanartID mapping, so that URLs cannot break in the future: the sites would then be free to move images around as much as they like (short of the site completely dying or the artwork being permanently deleted) without affecting our Textures database/local cache. On the same note, it would then be nice to move from crc32(URL)-named local files to something like "Thumbnails/sitename/movieid_imageid.tbn", to get away from the URL-centric naming.

PS: I've now broken my Python virginity. Next on my list: The real life one!
#21
ATM the plan is to detect "this URL is not valid" and prompt the user about what to do - in particular, they might want to run the script that Martijn is working on. Not only will it fix up the issue with their thumbs, it may also update rating info and the like.

IMO this is the best solution: it gives "correct" results in the texture db (without it, the image we have cached could in fact be a different image from the one the URL used to point to - not that the user is likely to care), and it also means multi-client setups need only hit the issue once, rather than once per client.

For music we'll certainly be requiring a rescan - ideally only of local tag information (embedded art, plus the way we determine whether folder art should be assigned to an album, requires this) - but if any of those URLs are also incorrect then it will involve online lookups as well.

There's really only so much we can do with backwards-compatibility stuff before it's easier just to enforce a rescan.

Note that Eden's export to a single file/folder structure gives you nicely named thumbnails and fanart (they're put in folders named appropriately), so obviously we'll be recommending that for Frodo stable regardless.

Cheers,
Jonathan
#22
"ATM the plan is to detect "this URL is not valid" and prompt the user what to do about it - in particular, they might want to run the script that Martijn is working on - not only will it fixup the issue with their thumbs, but it may also update rating info and the like."

That is a good idea; I like it. Speaking of the Refresh feature: will that reset any selected (*valid*) fan/thumb URLs to the defaults too, or just re-grab all the XML data while preserving the existing art selections?

Edit: Just tested it - no, it does not preserve them. Why the heck not!? ;-) So someone wanting to do a full refresh to get up-to-date IMDb scores would lose all their art assignments, heh. The scraper rewrite would do well to separate:
* Metadata (TMDb stuff like title, plot, actors)
* Rating (IMDb/TMDb)
* Image list (store as sitename+movieID+imageID collection rather than static URLs)
* Artwork selection (never lose it even during refresh or if URLs have died)
That would allow independent updating of image list or rating or metadata without ruining the library.

"IMO this is the best solution, it gives "correct" results in the texturedb (without it the image we have cached could in fact be a different image from the one the URL used to point to - not that the user is likely to care)"

That doesn't matter, though; the sanctity of the URL is not as important as the sanctity of the data once it's on disk. URLs are always subject to change. They may claim to be static now, but just watch a few years down the line when they move the data to a different CDN, change domain, delete certain pieces of artwork, etc.; trying to preserve some sort of validity of the URL is a fool's errand. What would work, however, is moving to a sitename+movieID+artworkID system; that would be resilient against everything but total site death or artwork deletion. You mentioned already looking into something like that, so that's very good news.

"Note that Eden's export to a single file/folder structure gives you nicely named thumbnails and fanart (they're put in folders named appropriately), so obviously we'll be recommending that for Frodo stable regardless."

Oh God, I thought that feature spat out some XML files plus all the "e55c7b08.tbn" etc. files, not that it actually named them nicely. Doh! Well, here's a script anyway that can run on the latest database version, with no need to use an older XBMC version. ;-) Also, if someone knows where the currently chosen art URLs are stored in the database, the HEAD-request method could be implemented to make it export *only* art that is now missing online, for an even leaner export. It would be very easy to add that to the script if I just knew where the selected art URLs live. All I've found so far is the full list of available online art per movie, not the actual selections.

I know that the selected art URLs HAVE to exist in the old database somewhere, because the new cache re-import found those selections that still had valid URLs...
#23
The currently chosen image URL is not stored in the Eden db (other than for fanart, and only for the particular case of choosing online fanart - whenever the user changes to a different online fanart image, the order of the URLs is switched around so that the topmost is the correct one). Thus the problem.

The backward compatibility of the new system just pulls the first thumb and fanart image it finds (local first, of course, then just the first one in the db). This is correct in the case where the user doesn't change the image after the initial scrape, but is wrong if they do (for thumbs at least) - see here, for instance:

http://forum.xbmc.org/showthread.php?tid=133308
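For anyone wanting to replicate that "first one in the db" behaviour in a script, something like the following would be a starting point. It assumes c08 holds the thumb list and c20 the fanart block in the Eden-era movie table - an assumption worth double-checking against your own MyVideos db, as is the relative-vs-absolute handling of fanart URLs:
Code:
import xml.etree.ElementTree as ET

def first_art_urls(c08_xml, c20_xml):
    # c08: a flat run of <thumb> elements; the first is what the
    # backwards-compatibility code would end up picking as the poster.
    thumb = None
    if c08_xml:
        root = ET.fromstring("<thumbs>%s</thumbs>" % c08_xml.encode("utf-8"))
        node = root.find("thumb")
        if node is not None:
            thumb = node.text
    # c20: a <fanart> wrapper whose <thumb> children may be relative to the
    # wrapper's url attribute (common with thetvdb-style packs).
    fanart = None
    if c20_xml:
        root = ET.fromstring(c20_xml.encode("utf-8"))
        base = root.get("url") or ""
        node = root.find("thumb")
        if node is not None and node.text:
            fanart = node.text if node.text.startswith("http") else base + node.text
    return (thumb, fanart)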

Fortunately, Frodo should make this problem go away in future.

Cheers,
Jonathan
#24
(2012-06-06, 05:58)jmarshall Wrote: The currently chosen image URL is not stored in the Eden db (other than for fanart, and only for the particular case of choosing online fanart - whenever the user changes to a different online fanart image, the order of the URLs is switched around so that the topmost is the correct one). Thus the problem.

Ah, that explains a lot. I *did* lean towards thinking the thumb/fanart hashing was the only thing deciding which image was selected, but I was confused by the fact that most movies in my database still had the old artwork I'd chosen, so I assumed the selection was stored somewhere. Now it makes more sense: I simply hadn't changed away from the default images for those movies!

If one were to go to the library, change the "Movies" path to a different scraper (like "None") and then back to Movies, would that re-scrape all metadata and images at once without creating DUPLICATE entries in the database? Or would it be like doing a full re-scan and inserting duplicates? I'm just trying to judge whether I should do that to get fresh data anyway, since most of my content used the default images.

(2012-06-06, 05:58)jmarshall Wrote: The backward compatibility of the new system just pulls the first thumb and fanart image it finds (local first, of course, then just the first one in the db). This is correct in the case where the user doesn't change the image after the initial scrape, but is wrong if they do (for thumbs at least) - see here, for instance:

http://forum.xbmc.org/showthread.php?tid=133308

Fortunately, Frodo should make this problem go away in future.

Cheers,
Jonathan

Yeah, huzzah for Frodo! He is a good friend and a good ring-bearer! ;-)
#25
Setting content to None (and telling it to clean out the current content) and back to Movies will refresh everything except what's stored in the files tables (which is the playcount info, and probably also the dateadded info now).

It will, however, delete any bookmarks you have, I think, which would include any resume points.

Cheers,
Jonathan
#26
Ah, so you must also tell it to clean the content when you switch, to avoid duplicates - alright, that makes perfect sense. Thanks, good to know. I'm ready to do it now.

I also don't care about losing date added metadata. My guess as to what will be lost:

* Any custom renaming of movies.
* Art selection.
* Subtitle/aspect ratio/etc per-movie choices.
* Date added to library.
* Bookmarks/resume points.
* Play count.
* Tickmark for "watched".
#27
For anyone reading this thread, there are various solutions to this issue.

Do one of the following:
* Clear whole library and start over from scratch with a fresh scraper grab of all data
* Export thumbnails+fanart with my script (available above), then manually refresh (press "i", then Refresh) the movies lacking a thumb/fanart, using the exported images as reference.
* Install an older version of XBMC and do a data export as Jonathan described. The drawback is that it also outputs .nfo files containing the outdated data, unless you export to a single folder, in which case the result is basically the same as my script - except that my script doesn't need an older XBMC version installed (although if your system doesn't have Python, installing an old XBMC version may be quicker).
* Wait for Martijn's script, which automatically adds the highest-rated thumbs/fanart to items lacking them.

I chose to go with a fresh re-scrape to get the latest TMDb data for all movies. I did that as follows:

* Use my standalone script (or an older XBMC version) to export all currently chosen thumbs and fan art to a safe location. Note that this works even if you've already run a newer version of XBMC; the database version doesn't matter, since it's only used to get the list of movie paths, so by all means use the latest one (yes, the one with "missing" images - it doesn't matter, they're not really missing!).
* Set your movie folders in XBMC to content: None. When it asks if you want to remove all items under that path from your library, say Yes.
* Go to XBMC's settings and run "Clean Library" to clear up any junk that may be left behind.
* Shut down XBMC (IMPORTANT), then go to your userdata folder and delete Database/Textures*.db (all versions of the Textures database must go!) and the whole "Thumbnails" folder (you're best off just deleting the gigabytes of old, misnamed junk and letting XBMC rebuild the whole thing with the new organization method; a small sketch of this cleanup step follows below the list).
* Start XBMC again, go to your movie folders, set them back to Movies and set up your scraper, etc. (remember that it will have lost any special settings, like using IMDb ratings rather than TMDb ratings, as well as per-movie scraper settings - although the latter mainly matters for TV series, where per-show overrides of the episode numbering scheme are COMMON).
* Let it do its job. It will download the latest info for every movie, and choose the topmost thumb and fan art.
* Finally, browse through your movie list and change any fan art you want back to what you had earlier, such as artwork you had imported yourself. Use the exported backup from step 1.
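If you'd rather script the deletion step than do it by hand, something along these lines works. The userdata path is just the same example value the export script uses; adjust it to your install, keep a backup, and obviously only run it while XBMC is shut down:
Code:
#!/usr/bin/env python

import glob
import os
import shutil

# Remove all Textures*.db files and the old Thumbnails folder from a (backed-up!) userdata dir.
xbmcuserdata = "/home/mediacenter/.xbmc/userdata/"

for db in glob.glob(os.path.join(xbmcuserdata, "Database", "Textures*.db")):
    print "removing", db
    os.remove(db)

thumbs = os.path.join(xbmcuserdata, "Thumbnails")
if os.path.isdir(thumbs):
    print "removing", thumbs
    shutil.rmtree(thumbs)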

YOU WILL LOSE SOME DATA BY DOING THIS:
* Any custom renaming of movies.
* Art selection.
* Subtitle/aspect ratio/etc per-movie choices.
* Date that the movie was added to library.
* Bookmarks/resume points.
* Play count.
* Tickmark for "watched".

I only cared about the "Watched" status, and took care of that as follows:
* Before I cleared my database, I went to my Movies list and set it to "Show only Watched" and took a screenshot of the whole list.
* After the re-import, I manually marked those as watched again.
* Yeah, it's low-tech, but it works. No sense in writing some SQL database manipulation / export / import tool for something that can be solved with some screenshots! ;-)



I actually did the complete data purge for TV shows as well, since I only had a small number of shows and didn't care that my script doesn't export TV show thumbs/fanart. The per-episode watched status is of course lost, but in my case that doesn't bother me, as I hadn't watched many things. Your mileage may vary.

Now that the thumbnail caching method has changed so drastically, it's a good idea to do a full database reset and re-import this way; it's also an opportunity to get updated metadata for all your media.
#28
Is there a thread or something detailing the thumbnail cache rewrite? I searched here and the wiki and couldn't really find anything. Is the rewrite complete or still in progress?
#29
Actually, Jonathan, this only clears the movie and tvshow tables, but not the files, path, bookmark or settings tables, so it MAY preserve actual per-file playback settings. It only removes the actual LIBRARY index of what the files belong to. I'll clarify my "what you will lose" list above if you can tell me what implications this has.

I noticed that the "files" table has added-times for a few items, but the field is blank for most of them.

Either way, I'm personally going to delete the actual MyVideos*.db files as well, to remove lingering junk and start fresh. Obviously I'm not recommending that extra step to inexperienced users, but it's nice to remove absolutely all the legacy gunk and start with the new indexing style across the board.

(2012-06-06, 07:19)RockDawg Wrote: Is there a thread or something detailing the thumbnail cache rewrite? I searched here and the wiki and couldn't really find anything. Is the rewrite complete or still in progress?

It is complete (barring any small fixes), and if you build XBMC from Git (or use the prebuilt nightlies if you're not on Linux) you'll be on it automatically.

They do, however, plan to someday improve things further, relying less on URLs and more on something like image IDs, to avoid this situation where changed URLs mean images cannot be re-downloaded. That's a "someday, maybe" goal though, so don't hold your breath for it.

As for how the new rewrite works, you'll get a pretty good picture of the changes if you read this thread, but basically it moved from IDs based on the LOCATION of your local media:
* Fanart = crc32(/path/to/moviefolder/American History X/American History X.mkv)
* Thumb = crc32(/path/to/moviefolder/American History X/)
To one based on web URL of image data:
* Fanart/Thumb ID = crc32(http://somesite.com/someimage.jpg)
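Put in terms of the get_crc32() helper from the export script earlier in the thread (the paths and URL here are just the placeholder examples from the list above):
Code:
# Old scheme: IDs derived from where the media lives on disk.
old_thumb_id  = get_crc32("/path/to/moviefolder/American History X/")
old_fanart_id = get_crc32("/path/to/moviefolder/American History X/American History X.mkv")

# New scheme: ID derived from the image's own web URL.
new_image_id  = get_crc32("http://somesite.com/someimage.jpg")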

This change of IDs means that your old "Thumbnails" folder is full of thousands of images of junk that will never again be used (unless you go back to an older version of XBMC).

When you upgrade, I do recommend a full library clear-out as I've just described, simply because the new system is so radically different that you might as well clear all your old data and re-scrape all your movie/TV show/music/etc. information to get fresh data as a bonus. As long as you back up the old userdata folder and take precautions like knowing that you'll lose certain info (such as "Watched" status), it's the best way forward. You get to enjoy a much lighter, restructured Thumbnails folder free of years of junk, and all your media will have the latest up-to-date information (including the benefit of scraper add-on improvements since you last scanned, and so on).
#30
Well, keeping the watched data worked for me last time when setting the source to None and back to Movies again.
