get_bucket() gives a "Bad Request" for S3 buckets that I did not create through Boto

I am using Boto to fetch a bucket from Amazon S3, but get_bucket() returns a Bad Request for some of my buckets. I'm starting to wonder if this is a bug in Boto, since I can see the bucket via get_all_buckets().

>>> from boto.s3.connection import S3Connection
>>> conn = S3Connection(S3_ACCESS_KEY, S3_SECRET_KEY)
>>> buckets = conn.get_all_buckets()
>>> buckets
[<Bucket: mysite-backups>]
>>> buckets[0]
<Bucket: mysite-backups>
>>> conn.get_bucket('mysite-backups')
Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "/path/to/virtualenv/lib/python2.7/site-packages/boto/s3/connection.py", line 502, in get_bucket
    return self.head_bucket(bucket_name, headers=headers)
  File "/path/to/virtualenv/lib/python2.7/site-packages/boto/s3/connection.py", line 549, in head_bucket
    response.status, response.reason, body)
S3ResponseError: S3ResponseError: 400 Bad Request
>>> conn.create_bucket('mysite_mybucket')
<Bucket: mysite_mybucket>
>>> conn.get_bucket('mysite_mybucket')
<Bucket: mysite_mybucket>

This happens even though the access keys belong to the same user account I used to create those buckets in the AWS console.

Any idea why this could be happening?

+8
python amazon-s3 boto
5 answers

It turns out the problem is with the region (I used Frankfurt). Two ways to deal with it:

  • Give up on Frankfurt (@andpei points to the problems currently being reported there) and recreate the bucket in another region.

  • Specify the region using the host parameter when connecting (thanks @Siddarth):

     >>> REGION_HOST = 's3.eu-central-1.amazonaws.com'
     >>> conn = S3Connection(S3_ACCESS_KEY, S3_SECRET_KEY, host=REGION_HOST)
     >>> conn.get_bucket('mysite-backups')
     <Bucket: mysite-backups>

    You can find the appropriate host for each region in the AWS endpoints documentation.

+15

Use a region-specific connection when working with buckets in different regions.
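A minimal sketch of that approach, assuming boto 2.x's boto.s3.connect_to_region helper and a release recent enough to know about eu-central-1 (the credentials and bucket name are the placeholders from the question):

 import boto.s3

 # connect_to_region builds an S3Connection pointed at the regional endpoint,
 # so requests are signed against the correct host.
 conn = boto.s3.connect_to_region(
     'eu-central-1',
     aws_access_key_id=S3_ACCESS_KEY,
     aws_secret_access_key=S3_SECRET_KEY,
 )
 bucket = conn.get_bucket('mysite-backups')

For a SigV4-only region such as Frankfurt you may still need to enable Signature Version 4, as described in the next answer.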

+2

A general and simple solution that does not involve changing the region or setting a specific host is described at https://github.com/boto/boto/issues/2916 . Lightly edited:

The Frankfurt AWS region (and apparently Ireland and CN, too) only supports the V4 signature algorithm. (...)

Per the boto documentation, you can add [s3] use-sigv4 = True to your ~/.boto config, or set the S3_USE_SIGV4 environment variable: os.environ['S3_USE_SIGV4'] = 'True' .
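A minimal sketch of the environment-variable variant, assuming the flag is read when the connection is created (the ~/.boto equivalent is shown in a comment; credentials and bucket name are the placeholders from the question). As the last answer below reports, a SigV4-only region may still require the regional host on top of this:

 import os

 # Opt in to Signature Version 4 before creating the connection.
 # The ~/.boto equivalent is:
 #   [s3]
 #   use-sigv4 = True
 os.environ['S3_USE_SIGV4'] = 'True'

 from boto.s3.connection import S3Connection

 conn = S3Connection(S3_ACCESS_KEY, S3_SECRET_KEY)
 bucket = conn.get_bucket('mysite-backups')
 # If this still fails for Frankfurt, pass host='s3.eu-central-1.amazonaws.com'
 # as shown in the last answer.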

+1

Pass the S3 region host to the boto connection:

conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, host=AWS_HOST)

+1

I had to use a combination of EOL's and seddonym's answers. First I indicated that I wanted to use SigV4:

 os.environ['S3_USE_SIGV4'] = 'True' 

Then, when connecting to the bucket, I also had to specify the host, which seemed a little redundant, but here it goes:

 s3 = S3Connection('key id', 'access key', host='s3.eu-central-1.amazonaws.com') 

Please note that this is for Frankfurt only.

0
