Alternative to scp: transferring files between Linux machines over parallel connections

Is there an alternative to scp for transferring a large file from one computer to another that opens parallel connections and also lets you pause and resume the download?

Please do not move this to serverfault.com. I am not a system administrator; I am a developer trying to migrate old databases between backup hosts and servers.

Thanks.

4 answers

You can try using split(1) to break the file into pieces and then scp the pieces in parallel. The file can then be recombined into a single file on the destination machine with cat.

  # on the local host
  split -b 1M large.file large.file.   # split into 1 MiB chunks
  for f in large.file.*; do scp "$f" remote_host: & done

  # on the remote host
  cat large.file.* > large.file
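
If you take this route, a fuller sketch of the same idea is below. It is only a sketch: the host name, user, and file names are placeholders, and the wait and checksum steps are additions not mentioned in the answer above.

  #!/usr/bin/env bash
  # Sketch: split a large file, copy the chunks in parallel over scp,
  # then reassemble and verify on the remote side.
  set -euo pipefail

  FILE=large.file        # placeholder file name
  REMOTE=remote_host     # placeholder ssh host

  # Record a checksum so the reassembled copy can be verified.
  sha256sum "$FILE" > "$FILE.sha256"

  # Split into 1 MiB chunks named large.file.part.aa, .ab, ...
  split -b 1M "$FILE" "$FILE.part."

  # Copy each chunk in a background scp, then wait for all of them.
  for f in "$FILE".part.*; do
      scp "$f" "$REMOTE": &
  done
  wait
  scp "$FILE.sha256" "$REMOTE":

  # On the remote host, reassemble and check:
  #   cat large.file.part.* > large.file
  #   sha256sum -c large.file.sha256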

Similar to Mike K's answer, https://code.google.com/p/scp-tsunami/ handles splitting the file, starting several scp processes to copy the parts, and then joining them again... it can also copy to multiple hosts...

  ./scpTsunami.py -v -s -t 9 -b 10m -u dan bigfile.tar.gz /tmp -l remote.host 

This splits the file into 10 MB chunks and copies them using 9 scp processes ...


Take a look at rsync to see if it meets your needs.
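
rsync does not open parallel connections by itself, but it does cover the pause/resume part of the question. A minimal sketch, assuming a single large file copied over ssh (user, host, and destination path are placeholders):

  # --partial keeps a partially transferred file, so rerunning the same
  # command after an interruption picks up from what already arrived
  # instead of starting over; --progress shows transfer status.
  rsync --partial --progress large.file user@remote.host:/backups/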

The correct place for a question does not depend on your role, but on the type of question. Since this one is not strictly programming-related, it will probably be migrated anyway.


The program you are looking for is lftp. It supports sftp and parallel transfers via its pget command. It is available on Ubuntu (sudo apt-get install lftp) and you can read about it here:

http://www.cyberciti.biz/tips/linux-unix-download-accelerator.html
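
A rough sketch of how that looks in practice, assuming the remote host runs an sftp server (the user name, host, path, and segment count are placeholders):

  # pget -n 4 downloads the file in 4 parallel segments; -c resumes a
  # previously interrupted transfer of the same file.
  lftp -e "pget -n 4 -c /backups/large.file; quit" sftp://user@remote.host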


Source: https://habr.com/ru/post/1311803/

