Best way to copy large files from server to server (That's not Rsync)
This discussion has been closed.
Comments
@hyelton - I'll let you have a Linux or Windows VM on a gigabit network with 500GB of storage. For 48 hours. Sound like a deal?
The slowness you're seeing is likely encryption overhead: encryption takes a good bit of CPU, and on servers with low resources that causes a slowdown. It's also why FTP (assuming you're using it without SSL) would be much faster — removing the encryption leaves more CPU for the actual transfer. Honestly, if you're worried about how long it's taking, FTP may be your best bet, followed by rsync without SSH. Any kind of compression will also slow things down, since that again burns CPU time. If you're not concerned about someone sniffing your data, an unencrypted transfer method will be significantly faster than an encrypted one.
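For example, a classic way to do an unencrypted transfer is tar piped over netcat. A rough sketch (hostnames, paths, and the port are placeholders; some netcat builds use `nc -l 5000` instead of `nc -l -p 5000`):

```shell
# On the receiving server: listen on an arbitrary port and unpack
# the incoming tar stream into /data.
nc -l -p 5000 | tar -C /data -xf -

# On the sending server: stream the directory as a tar archive to
# the receiver. No encryption, so only do this on a trusted network.
tar -C /data -cf - mydir | nc receiver.example.com 5000
```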
Cheers!
If the servers are close, you can always try NFS as well.
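A minimal NFS setup might look like this (a sketch — the export path, destination IP, and hostnames are placeholders, and both commands need root):

```shell
# On the source server: export the directory read-only to the
# destination's IP, then reload the export table.
echo '/data 203.0.113.10(ro,no_subtree_check)' >> /etc/exports
exportfs -ra

# On the destination server: mount the export and copy locally.
mount -t nfs source.example.com:/data /mnt/data
cp -a /mnt/data/. /backup/
```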
Looks like either the source or destination server does not have enough processing power and rsync over SSH is slow because of that. Try using rsyncd. If you want something simpler, then try playing with the compression and encryption options. For a decent guide, see https://gist.github.com/KartikTalwar/4393116 or http://janaksingh.com/blog/faster-rsync-over-ssh-using-arcfour-encryption-146
If you insist on using FTP, then try wget with the 'mirror' option to directly copy the files to the destination server from the source server. See this http://serverfault.com/questions/25199/using-wget-to-recursively-download-whole-ftp-directories
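The approach from that serverfault answer boils down to one command on the destination server (credentials, host, and path are placeholders):

```shell
# Recursively mirror an FTP directory tree from the source server
# straight onto the destination; --no-parent stays inside /data/.
wget --mirror --no-parent 'ftp://user:password@source.example.com/data/'
```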
Using a Windows server in between is probably not the best option. If you insist on doing that, then try Vultr or any other provider which allows hourly billing and supports Windows.
I'd actually suspect rsync to be faster than FTP, due to the lack of roundtrips (FTP is very bad with this).
Also, SSH isn't slow because it encrypts, it's slow because of the way it buffers up data, and how that is affected by latency. I forgot the specifics, though. There's a patch around somewhere that fixes that. Certainly, it's not the encryption causing it.
I'll echo the others' comments. 1) rsync isn't slow, there must be something else wrong. 2) try btsync.
You can try lftp, supports sftp and can use multiple connections.
Why don't you just compress the files and wget them?
Well, it's about 800GB of files, and my old server takes about 5 minutes to compress 55MB, so there's a problem with that lol.
Source: Install a simple webserver and turn directory listings on.
Target: wget -r --no-parent http://mysite.com/mydirectory/
What rsync command are you using? People ask but you didn't say. You can disable compression by not including the -z flag.
Also check the CPU usage when rsync is running, see if it's maxing out the CPU, if so then that's probably because of the encryption overhead.
Installing a webserver and using wget is always an easy option.
It already has a webserver; I haven't tried that command.
How do I turn directory listings on? I currently use a PHP directory listing, but going directly to the folder gives a 403.
Figured it out
Thanks, it works! But I just chose /Downloads, and it copies all the parent directories before the folder as well.
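Those extra directories are wget recreating the hostname and parent path locally; `-nH` and `--cut-dirs` strip them. A sketch (the cut count of 1 assumes /Downloads is the only leading path component):

```shell
# -nH drops the hostname directory; --cut-dirs=1 drops the first
# path component (/Downloads), leaving just its contents.
wget -r -nH --cut-dirs=1 --no-parent http://mysite.com/Downloads/
```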
syncthing https://syncthing.net/
Maybe pack everything, or use containers (like TrueCrypt), and then download via HTTP/FTP? Less size, more protection.
If you really want a GUI-like option, you could install mc on the source side and transfer the files over FTP/SFTP or the FISH protocol. Midnight Commander is available in almost every Linux distro's standard repo and has an ncurses GUI. Not sure if FISH supports resuming or not.
I mount my backup server's hard drive on my webserver using sshfs, then just cp the files over. Never had any problems with it.
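A sketch of that workflow (host, mount point, and paths are placeholders):

```shell
# Mount the backup server's disk locally over SSH, copy with plain
# cp, then unmount when done.
mkdir -p /mnt/backup
sshfs user@backup.example.com:/srv/backup /mnt/backup
cp -a /data/largefile /mnt/backup/
fusermount -u /mnt/backup
```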
Installing a webserver and using wget is always an easy option. You can also use third-party software to rsync large files; I'm using [some shitty bullshittery tool] to rsync a large number of files.
Avada Kedavra, necromancer!