Upload directory to s3 using boto

I am already connected to the instance and I want to upload the files created by my Python script directly to S3. I tried this:

    import boto
    s3 = boto.connect_s3()
    bucket = s3.get_bucket('alexandrabucket')
    from boto.s3.key import Key
    key = bucket.new_key('s0').set_contents_from_string('some content')

but this rather creates a new object s0 with the content "some content", while I want to upload the directory s0 to mybucket.

I also looked at s3put, but I was not able to get what I want.

amazon-s3 amazon-web-services amazon-ec2 boto
4 answers

There is nothing in the boto library itself that would allow you to upload an entire directory. You can write your own code to traverse the directory using os.walk or similar, and upload each individual file using boto.
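For example, here is a minimal sketch of that approach with legacy boto. The bucket name and directory come from the question; the upload_dir helper is illustrative, not part of boto:

    import os
    import boto
    from boto.s3.key import Key

    conn = boto.connect_s3()
    bucket = conn.get_bucket('alexandrabucket')

    def upload_dir(local_dir, prefix=''):
        # Walk the local directory tree and upload each file,
        # keyed by its path relative to local_dir.
        for root, dirs, files in os.walk(local_dir):
            for name in files:
                local_path = os.path.join(root, name)
                rel_path = os.path.relpath(local_path, local_dir)
                k = Key(bucket)
                k.key = os.path.join(prefix, rel_path).replace('\\', '/')
                k.set_contents_from_filename(local_path)

    upload_dir('s0', prefix='s0')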

Boto ships with a command line utility called s3put that can handle this, or you can use the AWS CLI tool, which has a lot of features that allow you to upload entire directories or even sync an S3 bucket with a local directory, or vice versa.
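For instance, with the AWS CLI a single sync command uploads a whole directory (bucket and directory names here are taken from the question; adjust as needed):

    aws s3 sync ./s0 s3://alexandrabucket/s0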


The following function can be used to upload a directory to S3 using a boto3 client.

    import os
    import boto3

    s3C = boto3.client('s3')

    def uploadDirectory(path, bucketname):
        # Upload each file to the bucket root, keyed by its bare file name.
        for root, dirs, files in os.walk(path):
            for file in files:
                s3C.upload_file(os.path.join(root, file), bucketname, file)

Specify the directory path and bucket name as inputs. Files are placed directly at the top level of the bucket. Change the last argument of upload_file() to put them under key prefixes ("directories"), as in the sketch below.
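As a hedged illustration of that change, this variant keeps each file's path relative to the uploaded directory as its key, so subdirectories become prefixes in the bucket:

    import os
    import boto3

    s3C = boto3.client('s3')

    def uploadDirectory(path, bucketname):
        for root, dirs, files in os.walk(path):
            for file in files:
                full_path = os.path.join(root, file)
                # Key the object by its relative path so "folders" are preserved.
                key = os.path.relpath(full_path, path).replace('\\', '/')
                s3C.upload_file(full_path, bucketname, key)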


You can do the following:

    import os
    import boto3

    s3_resource = boto3.resource("s3", region_name="us-east-1")

    def upload_objects():
        try:
            bucket_name = "S3_Bucket_Name"  # s3 bucket name
            root_path = 'D:/sample/'  # local folder for upload

            my_bucket = s3_resource.Bucket(bucket_name)

            for path, subdirs, files in os.walk(root_path):
                path = path.replace("\\", "/")
                directory_name = path.replace(root_path, "")
                for file in files:
                    my_bucket.upload_file(os.path.join(path, file), directory_name + '/' + file)

        except Exception as err:
            print(err)

    if __name__ == '__main__':
        upload_objects()
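Note that for files sitting directly in root_path, directory_name is empty, so their keys get a leading '/'; strip it if you don't want an empty top-level folder in the bucket.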

To download the files of a folder from S3, we can use:

    import boto
    from boto.s3.key import Key

    keyId = 'YOUR_AWS_ACCESS_KEY_ID'
    sKeyId = 'YOUR_AWS_SECRET_ACCESS_KEY'
    bucketName = 'your_bucket_name'

    conn = boto.connect_s3(keyId, sKeyId)
    bucket = conn.get_bucket(bucketName)

    # The local destination directories model/data and model/nlu_data
    # must already exist.
    for key in bucket.list():
        print(">>>>>" + key.name)
        pathV = key.name.split('/')
        if pathV[0] == "data":
            if pathV[1] != "":
                srcFileName = key.name
                filename = key.name.split('/')[1]
                destFileName = "model/data/" + filename
                k = Key(bucket, srcFileName)
                k.get_contents_to_filename(destFileName)
        elif pathV[0] == "nlu_data":
            if pathV[1] != "":
                srcFileName = key.name
                filename = key.name.split('/')[1]
                destFileName = "model/nlu_data/" + filename
                k = Key(bucket, srcFileName)
                k.get_contents_to_filename(destFileName)
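For comparison, a minimal boto3 sketch of the same kind of download (the bucket name is a placeholder, only the "data/" prefix is shown, and credentials come from the default credential chain):

    import os
    import boto3

    s3 = boto3.resource('s3')
    bucket = s3.Bucket('your_bucket_name')

    # Download every object under the prefix into model/data/,
    # creating the local directories as needed.
    for obj in bucket.objects.filter(Prefix='data/'):
        if obj.key.endswith('/'):
            continue  # skip "folder" placeholder keys
        dest = os.path.join('model', obj.key)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        bucket.download_file(obj.key, dest)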
