How to efficiently copy all files from one directory to another in an Amazon S3 bucket using boto?

I need to copy all the keys from '/old/dir/' to '/new/dir/' within the same Amazon S3 bucket. I came up with this script (quick hack):

import boto

s3 = boto.connect_s3()
thebucket = s3.get_bucket("bucketname")
keys = thebucket.list('/old/dir')
for k in keys:
    # build the destination key name by swapping the prefix
    newkeyname = '/new/dir' + k.name.partition('/old/dir')[2]
    print 'new key name:', newkeyname
    # server-side copy within the same bucket
    thebucket.copy_key(newkeyname, k.bucket.name, k.name)

It works, but it is much slower than what I can do manually in the AWS Management Console by simply copying and pasting with the mouse. Very frustrating, and there are many keys to copy...

Do you know a faster method? Thank you.

Edit: maybe I can do this with concurrent copy processes. I am not very familiar with how boto copies keys, or with how many parallel requests I can send to Amazon.

Edit 2: I am currently studying Python multiprocessing. Let's see if I can send 50 copy operations at the same time...
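A minimal sketch of what that could look like with multiprocessing.Pool, reusing the same boto calls as the script above; the bucket and prefix names are the ones from the question, while the pool size of 50 and the lazy per-worker connection are my own assumptions, not tested recommendations:

from multiprocessing import Pool

import boto

BUCKET_NAME = 'bucketname'
SRC_PREFIX = '/old/dir'
DST_PREFIX = '/new/dir'

_bucket = None  # one bucket handle per worker process, created lazily


def copy_one(key_name):
    global _bucket
    if _bucket is None:
        # boto connections should not be shared across processes,
        # so each worker opens its own
        _bucket = boto.connect_s3().get_bucket(BUCKET_NAME)
    newkeyname = DST_PREFIX + key_name.partition(SRC_PREFIX)[2]
    _bucket.copy_key(newkeyname, BUCKET_NAME, key_name)
    return newkeyname


if __name__ == '__main__':
    bucket = boto.connect_s3().get_bucket(BUCKET_NAME)
    names = [k.name for k in bucket.list(SRC_PREFIX)]
    pool = Pool(processes=50)  # try 50 parallel copies
    for done in pool.imap_unordered(copy_one, names):
        print(done)
    pool.close()
    pool.join()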

Edit 3: 30 simultaneous copies with Python multiprocessing work fine. The only remaining problem is keys larger than 5 GB: a plain boto copy fails on those, so the script needs to handle them separately.

1 answer:

Regarding keys over 5 GB: S3 limits a single PUT (and therefore a single copy) to 5 GB; for larger objects boto supports multipart upload (see the boto documentation and the Amazon S3 documentation).
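For keys over 5 GB, a rough sketch of how such a multipart copy could look with boto's initiate_multipart_upload and copy_part_from_key; the 512 MB part size and the example key name are assumptions, not taken from the original script:

import math

import boto

BUCKET_NAME = 'bucketname'
PART_SIZE = 512 * 1024 * 1024  # 512 MB per part (must be at least 5 MB)


def multipart_copy(bucket, src_key_name, dst_key_name):
    # Copy a key of any size by copying byte ranges as multipart parts.
    src_key = bucket.get_key(src_key_name)
    size = src_key.size
    mp = bucket.initiate_multipart_upload(dst_key_name)
    try:
        part_count = int(math.ceil(size / float(PART_SIZE)))
        for i in range(part_count):
            start = i * PART_SIZE
            end = min(start + PART_SIZE, size) - 1  # byte ranges are inclusive
            mp.copy_part_from_key(bucket.name, src_key_name, i + 1,
                                  start=start, end=end)
        mp.complete_upload()
    except Exception:
        mp.cancel_upload()
        raise


if __name__ == '__main__':
    bucket = boto.connect_s3().get_bucket(BUCKET_NAME)
    # hypothetical example key name
    multipart_copy(bucket, '/old/dir/bigfile', '/new/dir/bigfile')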

