"s3cmd get" overwrites local files

I'm trying to download an S3 directory to my local machine using s3cmd. I use the command:

s3cmd sync --skip-existing s3://bucket_name/remote_dir ~/local_dir 

But if I restart the download after an interruption, s3cmd does not skip the previously downloaded local files — it overwrites them. What is wrong with the command?

+7
2 answers

Use boto-rsync instead. https://github.com/seedifferently/boto_rsync

It correctly syncs only new/changed files from S3 to the local directory.
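For reference, a sketch of how boto-rsync might be invoked for the scenario in the question (the bucket and directory names are the question's own placeholders; consult the project's README for the full option list). This cannot be run without S3 credentials, so treat it as illustrative:

```shell
# Sync an S3 prefix down to a local directory with boto-rsync.
# It takes rsync-style source/destination arguments; only new or
# changed files are transferred on repeated runs.
boto-rsync s3://bucket_name/remote_dir/ ~/local_dir/
```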

+3

I had the same problem and found a solution in comment #38 by William Dennis at http://s3tools.org/s3cmd-sync

If you have:

 $ s3cmd sync --verbose s3://mybucket myfolder 

Change it to:

 $ s3cmd sync --verbose s3://mybucket/ myfolder/ # note the trailing slashes 

Then the MD5 hashes are compared and everything works correctly! --skip-existing works too.

To repeat: with the first command, neither --skip-existing nor MD5 verification takes effect; with the second, both work (I was mistaken in my previous comment, since I had tested two different directories).

+15
