Kodi Community Forum

Full Version: XBMC & remote library sync
Hey guys,

Long time reader, first time poster. I have been scouring the net for answers to my issue. I, like many of you I'm sure, have my own HTPC running XBMC, but along with that I'm also in charge of managing several remote-site HTPCs. These offsite machines are several hours away. I'd like a way of keeping the libraries on all of the remote machines in sync with my server in an automated fashion. I don't want a cloud solution, as there is no need to store all of this data there; just a direct sync between sites.

I've ventured into SFTP (my current method) as well as VPNs, but without setting up a tunnel between sites it can be cumbersome. I've played with Microsoft SyncToy for backing up my library to my externals, but my only method of getting the library onto the remote HTPCs has been to load it on an external drive and run WinMerge/SyncToy when I physically show up at that site. I'm currently in the middle of converting old home movies from VHS to the computer, and having them copy over to the remote sites as soon as I place them in a folder would be such a time saver.

Any suggestions would be appreciated. Free solutions would be even better.

Or, if it is possible, have a front end similar to Maraschino alert the remote sites that new files are available for download, and allow those users to select what they want to retrieve from my server.

Again thanks!
http://wiki.xbmc.org/index.php?title=HOW...sing_MySQL

Obviously there are a few more wrinkles in getting it to work over an untrusted network, but that is what VPN is for.
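For reference, the linked guide boils down to pointing every XBMC install at one shared MySQL server via an advancedsettings.xml. A minimal sketch (the host address and credentials below are placeholders; use your own):

```xml
<advancedsettings>
  <videodatabase>
    <type>mysql</type>
    <host>192.168.1.10</host>
    <port>3306</port>
    <user>xbmc</user>
    <pass>xbmc</pass>
  </videodatabase>
</advancedsettings>
```

Every client with this file shares one library, including watched status, which is exactly why running it over the open internet needs the VPN mentioned above.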

I assume you have phenomenally fast internet speeds if you are serving media from site to site.
I wouldn't use MySQL over the Internet. That's just asking for painful slowdowns.
Yeah but if he has internet fast enough to be doing what he is contemplating it shouldn't be too bad should it?

MySQL also has remote replication - it is normally seen as a backup or load balancing thing, but could it also be used for this? I dunno.
Doesn't MySQL only store the metadata? Or are the files stored within the SQL database itself?

Basically what I need: say I have a drive in my server with the file structure e:\home movies\, and I create a new folder named 1988 and place two or three home movies in it. I would then want it to replicate, on a schedule, to the remote server with the same structure, e:\home movies\1988.

I plan on my next trip to bring all the latest ones on an external drive and copy them over to my parents' HTPC, so that future remote syncs won't require utilizing my bandwidth for hours and hours. My connection is only 30 Mbit down / 10 Mbit up, so I know it will be slow; I'm looking to up it in the coming months.

Basically creating an offsite backup system as well.
Something like OwnCloud or CrashPlan allows you to synchronise folders automatically over the internet.

CrashPlan will allow computer-to-computer syncing for free, you only need to pay if you wish to use their Cloud services.

You could quite easily set up CrashPlan on both systems and keep the folders in sync - yours as the "master" and your parents' as the "backup" location.

If you set up the metadata with NFO files, then XBMC will use these to give you identical data in the XBMC database without actually sharing it (I'm guessing you don't know or care whether your parents have watched Resident Evil, and your parents don't care whether you've watched My Little Pony) - whereas exposing MySQL over the internet will be 1) risky, and 2) slow.
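For illustration, a minimal movie NFO sketch: XBMC picks up a file with the same name as the video (e.g. birthday.nfo next to birthday.avi) and fills the library from it instead of scraping online. The title and plot here are made up:

```xml
<movie>
  <title>Birthday 1988</title>
  <plot>Home movie transferred from VHS.</plot>
</movie>
```

Since the NFO travels with the video file, whatever sync tool you pick will carry the metadata along for free.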
Lol...my little pony.. nice.

Have you used crashplan before? Do you know if the data is encrypted during transfer? The meta data on the other end i'm not really worried about as that is the only computer playing content so no mysql setup on that end.
rsync is ideal for syncing data across multiple locations.
Rsync
Rsync is the typical protocol used by hosting mirrors to keep all the mirrors up to date: one is the master, and the others all pull content from it.
Although very capable, it can also be quite complex to set up, and it runs best under Linux.

FTP
In case you need something running under Windows, I would simply use FTP like you are doing and script the upload from your end, having it run every night for instance.
You would need to set up an FTP server on their end, and an FTP client on your end that supports scripting.
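As a sketch of what that scripting looks like: the stock Windows ftp client can read its commands from a file passed with -s, which makes it schedulable. The host, credentials, and paths below are placeholders:

```shell
# Generate the command file the Windows ftp client will replay.
# (Run nightly on Windows as:  ftp -i -s:upload.ftp )
cat > upload.ftp <<'EOF'
open remote-htpc.example.org
user mediauser secretpass
binary
lcd e:\home movies\1988
cd /home-movies/1988
mput *
bye
EOF
```

Point Task Scheduler at a one-line batch file containing the ftp invocation and the upload runs unattended every night.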

Encryption?
Why are you using VPN / FTPS ? Are you super concerned with encrypting this content while it is being sent ?
I ask this because it adds quite a lot of overhead to the jobs.

Your 10 Mbit up is not much but will be enough to copy a few movies over per day. I would do the initial sync manually by taking a disk over and performing a local copy.

Torrent
It might not seem like the most likely candidate, but the distributed nature of the torrent protocol makes it an excellent fit for this task.
You could set up your own private torrent tracker on your system, and create a torrent file for every new VHS you upload.
Using FTP, VPN, or even email you can then send these torrent files to your remote locations, where you (or someone else) needs to put them in the watch folder.
The systems will start to download your torrent from your machine, but will also upload acquired parts to each other.
Without doubt, torrent is the fastest way to get one file to multiple locations, and it should be relatively easy to set up.

I used this one time over a local switch to distribute a 40GB preconfigured virtual machine file to 20 machines (for a course, students had to learn about virtualization). Torrenting the file to the workstations completely saturated both the download and the upload of every port on the switch until all workstations finished almost at the same time. The server had uploaded just over 40 GB of data, and so had almost every workstation, effectively speeding the upload up almost 20 times.
Thanks guys for the suggestions, I think I'll give rsync a try!
Yeah, I quite like the idea of a private torrent tracker for this task, but only if you have a number of nodes to transfer to (i.e. it produces no advantage if you are replicating HostA to HostB, but it will be an advantage if you are replicating HostA to both HostB and HostC, because a lot of the traffic will go between HostB and HostC in both directions rather than flooding HostA's link to download it twice).
Yeah, I'd like to make it where HostA replicates to HostB, and HostB to HostC (HostC has two independent XBMCs, one a Raspberry Pi, the other a Mac mini, set up over long-range wifi - 1 network, 2 houses lol), and then let HostC send to a future HostD that I'm working on now.

When I visit family I spend more time updating machines than I do visiting haha.
You can't underestimate the bandwidth of a DVD disk in the mail.
(2013-06-17, 02:09)crypticknight Wrote: Yeah I'd like to make it where Host A replicates to HostB, HostB to Host C, Host C has 2 independent xbmc's (ones a raspberry pi other a mac mini) setup over a long range wifi (1 network 2 houses lol) and then Let Host C send to a future Host D that i'm working on now.

When i visit family i spend more time updating machines than I do visiting haaha.

This should all be possible with rsync pulls from A to B, and then pull from B to C etc. The same script should work on each host, just change the source. Unless you have static IPs at each site, you probably want to look into getting a dynamic dns entry for each site.
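The chained-pull idea above can be sketched as one small script deployed identically on every host, with only the upstream hostname changed per site. The dynamic-DNS names and paths are placeholders:

```shell
# Generate the per-site pull script (install as e.g. /usr/local/bin/pull-media.sh).
cat > pull-media.sh <<'EOF'
#!/bin/sh
# HostB pulls from HostA; on HostC change SOURCE to HostB's dyndns name, etc.
SOURCE="hosta.dyndns.example.org"
rsync -avz --partial "sync@${SOURCE}:/srv/home-movies/" "/srv/home-movies/"
EOF
chmod +x pull-media.sh
# cron entry to pull nightly at 3am:
#   0 3 * * * /usr/local/bin/pull-media.sh
```

Staggering the cron times down the chain (A-to-B at 1am, B-to-C at 3am) gives each hop a head start on the next.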

I have a remote NAS (FreeNAS) that automatically runs an rsync script to pull new media updates from my "home" FreeNAS server, and connected to the remote NAS is a Pi running OpenELEC with /storage mounted over NFS and MySQL media library (which is also accessed by a Revo 3700 with OpenELEC).

When the rsync job has finished, if any new items have been synced the NAS will initiate a video library scan on the Pi, followed by an Artwork Downloader scan (for both movies and tv shows), and finally ensure that all artwork is correctly cached on the Pi - this final step could also be performed on the Revo but it's not always guaranteed to be switched on.
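That scan trigger is a plain JSON-RPC 2.0 call to XBMC's built-in web server; VideoLibrary.Scan is the method that kicks off a library update. The hostname and port below are placeholders and assume the web server is enabled in XBMC's settings:

```shell
# Build the JSON-RPC request that starts a video library scan.
payload='{"jsonrpc":"2.0","method":"VideoLibrary.Scan","id":1}'
echo "$payload"
# Send it to the Pi with curl, e.g.:
#   curl -s -H 'Content-Type: application/json' -d "$payload" http://pi.local:8080/jsonrpc
```

Running that as the last step of the rsync job is what makes the new files show up in the library without anyone touching the remote box.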

I'm also able to remote into the NAS and perform various XBMC maintenance tasks directly from the NAS, such as identifying any movies or TV shows with missing plots or artwork and having them automatically re-scanned (assuming the missing plots/artwork and other items have now been added) - all using JSON queries run against the Pi.
(2013-06-17, 03:16)nickr Wrote: You can't underestimate the bandwidth of a DVD disk in the mail.

Yea....just another reason for someone to call me and ask "what do I do now"

Yeah, what's the proverb about giving a man a fish? It doesn't work lol. Tried teaching, but I get frustrated, so I just figure out a way to do it myself and leave all the nontechnical folks out of the loop... of course they probably came to that realization first... trickery!!