Surprisingly, I could not find a free and open-source tool on Linux that states precisely in its manual that it supports syncing a directory on an FTP server with a local directory on my computer.
When I say "syncing", I mean it will fetch only the changes since the last sync into my local directory, instead of re-downloading the whole thing again, which can be cumbersome.
When I say "FTP", yes, I really mean plain old FTP, which I have always been against. If I could, I would never use it. As you may already have guessed, I have no choice: the company I work with subscribes to several shared web hosting plans, which usually come with a creaky FTP server as the only means of transferring files between our local computers and the servers. It just totally sucks.
The first thing that came to my mind was, of course, rsync, but the problem is that it does not support FTP.
My second thought was to use the "mirror" command of lftp. However, the manual page does not say exactly whether or not it really "syncs". I have a hunch that it can only download new files but cannot keep the local directory in sync with what is on the FTP server. So this does not work either.
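For the record, a basic mirror invocation looks something like the following; the server name, credentials, and paths are just placeholders, not my real setup:

    # fetch the remote tree into a local directory, downloading
    # only files that are newer than the local copies
    lftp -e "mirror --only-newer --verbose /public_html /home/me/site-copy; quit" \
         -u myuser,mypassword ftp.example.com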
The third idea is the one I chose. Later, though, I found that the low quality of my local ADSL connection made it difficult to download a directory with many subdirectories and lots of small files. Still, if your Internet connection is not bad, this can be a good approach.
The idea is to use curlftpfs to mount a directory on an FTP server onto a local directory on your hard disk. Once you can access the FTP files as if they were local files, you can pick any of your favourite backup tools to sync them with your local directory. I use rdiff-backup to keep versioned copies of the FTP files on my local hard disk.
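Here is a minimal sketch of the procedure; the server name, credentials, mount point, and paths below are placeholders for illustration:

    # 1. mount the remote FTP directory as a local FUSE filesystem
    mkdir -p ~/ftp-mount
    curlftpfs ftp://myuser:mypassword@ftp.example.com/public_html ~/ftp-mount

    # 2. back up the mounted tree; rdiff-backup keeps a current copy
    #    plus reverse diffs, so older versions can be restored later
    rdiff-backup ~/ftp-mount ~/backups/site

    # 3. unmount the FUSE filesystem when done
    fusermount -u ~/ftp-mount

A nice side effect of rdiff-backup is that you can restore a file as it was at an earlier sync, e.g. rdiff-backup -r 3D ~/backups/site/index.html restored-index.html brings back the version from three days ago (index.html is just an example file name).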
See part 2, where I figured out how to sync an FTP directory via lftp: how to sync ftp directory with a local directory in linux [part 2]