Can't access files in an open S3 bucket using boto

I recently created a new AWS account (call it "Account A") and created an S3 bucket in that account (call it "bucketa"), then uploaded a file foo.txt to it. Following advice from the internet, I set up what is, as far as I can tell, the most permissive bucket policy possible (it should allow any action by any user):

{ "Version": "2012-10-17", "Id": "PolicyABCDEF123456", "Statement": [ { "Sid": "StmtABCDEF123456", "Effect": "Allow", "Principal": "*", "Action": "s3:*", "Resource": [ "arn:aws:s3:::bucketa/*", "arn:aws:s3:::bucketa" ] } ] } 

After creating an IAM user for account A ("Identity and Access Management → Users → Create New Users", with "Generate Access Key for Each User" checked) and saving that user's credentials in ~/.boto, a simple script using the boto S3 interface can access the uploaded foo.txt file:

    import boto

    conn = boto.connect_s3()
    b = conn.get_bucket("bucketa", validate=False)
    k = boto.s3.key.Key(b)
    k.key = "foo.txt"
    print len(k.get_contents_as_string())  # prints 9
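
For reference, the ~/.boto file only needs a [Credentials] section; a minimal sketch with placeholder values (the real keys are the ones generated in the IAM step above):

    [Credentials]
    # placeholder values -- use the access key generated for the IAM user
    aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
    aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx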

I then created a second AWS account (I will call it "Account B") and followed the same steps, saving the IAM credentials in a .boto file and running the same Python script. However, in this case I get a 403 error when I execute the line print len(k.get_contents_as_string()):

    Traceback (most recent call last):
      File "access.py", line 7, in <module>
        print len(k.get_contents_as_string())
      File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1775, in get_contents_as_string
        response_headers=response_headers)
      File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1643, in get_contents_to_file
        response_headers=response_headers)
      File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1475, in get_file
        query_args=None)
      File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 1507, in _get_file_internal
        override_num_retries=override_num_retries)
      File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 343, in open
        override_num_retries=override_num_retries)
      File "/usr3/josilber/.local/lib/python2.7/site-packages/boto/s3/key.py", line 291, in open_read
        self.resp.reason, body)
    boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
    <?xml version="1.0" encoding="UTF-8"?>
    <Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>7815726085F966F2</RequestId><HostId>EgFfldG4FoA5csuUVEKBq15gg3QQlQbPyqnyZjc2fp5DewlqDZ4F4HNjXYWQtBl5MUlSyLAOeKA=</HostId></Error>

Why can't account B access bucketa despite its very permissive bucket policy? Are there any additional permissions that I need to configure to provide open access to other AWS accounts?

Note: I have already ruled out invalid credentials in account B's .boto file as the culprit by creating an S3 bucket bucketb from account B with the same bucket policy ("bucketa" replaced by "bucketb" in the two Resource lines). In that case I can access bucketb with account B's credentials, but I get the same 403 error when using account A's credentials.

1 answer

Your policy allows anonymous users to access the bucket. In your case, however, account B is not an anonymous user; it is an authenticated AWS user, and if you want that user to have access, you will need to grant it explicitly in the policy. Alternatively, you can access the bucket anonymously from boto:

    conn = boto.connect_s3(anon=True)
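
Putting that together with the script from the question, a minimal sketch of an anonymous read from account B (requests are sent unsigned, so only the bucket policy's anonymous allowance applies):

    import boto

    # anonymous (unsigned) connection -- no AWS credentials are used to sign requests
    conn = boto.connect_s3(anon=True)
    b = conn.get_bucket("bucketa", validate=False)
    k = boto.s3.key.Key(b)
    k.key = "foo.txt"
    print len(k.get_contents_as_string())  # should print 9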

That should do the trick. It probably goes without saying, but I would not leave this policy in place as-is: it allows anyone to put anything they want into the bucket.
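
If you would rather keep requests authenticated and not leave the bucket wide open, here is a sketch of a bucket policy that grants read access to account B explicitly (ACCOUNT_B_ID is a placeholder for account B's 12-digit account ID, and the Sid is arbitrary):

    {
      "Version": "2012-10-17",
      "Id": "PolicyABCDEF123456",
      "Statement": [
        {
          "Sid": "AllowAccountBRead",
          "Effect": "Allow",
          "Principal": { "AWS": "arn:aws:iam::ACCOUNT_B_ID:root" },
          "Action": [ "s3:GetObject", "s3:ListBucket" ],
          "Resource": [
            "arn:aws:s3:::bucketa",
            "arn:aws:s3:::bucketa/*"
          ]
        }
      ]
    }

Note that with the account root as the principal, account B's IAM users also need an IAM policy in account B that allows those s3 actions before the cross-account access works.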
