Unable to set file content type in S3

How do I set the content type of a file in an S3-hosted static website using the Python boto module?

I do:

    from boto.s3.connection import S3Connection
    from boto.s3.key import Key
    from boto.cloudfront import CloudFrontConnection

    conn = S3Connection(access_key_id, secret_access_key)
    bucket = conn.create_bucket('mybucket')
    b = conn.get_bucket(bucket)
    b.set_acl('public-read')

    fn = 'index.html'
    template = '<html>blah</html>'

    k = Key(b)
    k.key = fn
    k.set_contents_from_string(template)
    k.set_acl('public-read')
    k.set_metadata('Content-Type', 'text/html')

However, when I access it at http://mybucket.s3-website-us-east-1.amazonaws.com/index.html , my browser prompts me to download the file instead of rendering it as a web page.

Looking at the metadata in the S3 management console, I can see that the Content-Type was actually set to "application/octet-stream". If I change it manually in the console, the page is served correctly, but as soon as I run my script again it resets to the wrong content type.
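A quick way to confirm what S3 actually stored, without the console, is to read the key back with boto (a minimal check, reusing the bucket and credentials from my script above):

    # Minimal check: HEAD the object and print the Content-Type S3 stored.
    from boto.s3.connection import S3Connection

    conn = S3Connection(access_key_id, secret_access_key)
    b = conn.get_bucket('mybucket')
    k = b.get_key('index.html')   # issues a HEAD request and populates metadata
    print(k.content_type)         # shows application/octet-stream for me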

What am I doing wrong?

python amazon-s3 boto
2 answers

The set_metadata method is really intended for setting user metadata on S3 objects. Many of the standard HTTP metadata fields have first-class attributes to represent them, e.g. content_type. Also, you want to set the metadata before sending the object to S3. Something like this should work:

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')  # assumes the bucket already exists
    key = bucket.new_key('mykey')
    key.content_type = 'text/html'
    key.set_contents_from_string(mystring, policy='public-read')

Note that you can apply a canned ACL policy while writing the object to S3, which avoids the need for a separate API call.
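If you prefer to be explicit, you can also pass the HTTP headers directly when writing the object; boto sends them with the PUT request, so S3 stores them on the object. A sketch along the same lines as the code above (the Cache-Control value is just an example):

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')
    key = bucket.new_key('index.html')
    # Headers passed here go out with the PUT request and are stored on the object.
    key.set_contents_from_string(
        '<html>blah</html>',
        headers={'Content-Type': 'text/html', 'Cache-Control': 'max-age=300'},
        policy='public-read')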


I was not able to get the above solution to actually save my changes to the metadata.

Perhaps because I was uploading from a file, and boto changed the content type based on the guessed mimetype? I also upload .m3u8 and .ts files for HLS streaming, so that may get in the way as well.

Anyway, this is what worked for me.

    import boto
    from boto.s3.key import Key

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')

    key_m3u8 = Key(bucket)
    key_m3u8.key = s3folder + "/" + s3keyname
    key_m3u8.metadata = {"Content-Type": "application/x-mpegURL",
                         "Cache-Control": "public,max-age=8"}
    key_m3u8.set_contents_from_filename("path_to_my_file", policy="public-read")
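If the real culprit is boto guessing the type from the filename via Python's mimetypes module (which is what seems to happen when no content type is set), another option that might work is registering the HLS extensions before uploading. Untested on my side, just a sketch; the key and file names are placeholders:

    import mimetypes
    import boto

    # Teach the mimetypes module about HLS files so the guessed type is correct.
    mimetypes.add_type('application/x-mpegURL', '.m3u8')
    mimetypes.add_type('video/MP2T', '.ts')

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')
    key = bucket.new_key('videos/playlist.m3u8')   # hypothetical key name
    key.set_contents_from_filename('playlist.m3u8', policy='public-read')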
