Large file from EC2 to S3

I have a 27 gigabyte file that I am trying to move from an AWS Linux EC2 instance to S3. I have tried both the s3put command and the s3cmd put command. Both work with a test file, but neither works with the large file. No errors are reported; the command returns immediately, but nothing happens.

s3cmd put bigfile.tsv s3://bucket/bigfile.tsv 
3 answers

Although S3 supports objects up to 5 TB in size, a single PUT operation is limited to 5 GB.

To upload files larger than 5 GB (or even files larger than 100 MB), you will want to use S3's multipart upload feature.

http://docs.amazonwebservices.com/AmazonS3/latest/dev/UploadingObjects.html

http://aws.typepad.com/aws/2010/11/amazon-s3-multipart-upload.html

(Ignore the obsolete description of the 5 GB object restriction in the blog post above. The current limit is 5 TB.)

The boto library for Python supports multipart upload, and the latest boto source includes the s3multiput command-line tool, which takes care of the details for you and even uploads the parts in parallel.

https://github.com/boto/boto
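
For reference, here is a minimal sketch of what a multipart upload looks like with the boto2-era API. It assumes AWS credentials are already configured (environment variables or ~/.boto); the bucket and file names are just this question's examples, and the 50 MB part size is an arbitrary choice:

    import math
    import os

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('bucket')

    source_path = 'bigfile.tsv'
    source_size = os.stat(source_path).st_size

    # Each part must be at least 5 MB (except the last); 50 MB parts
    # keep a 27 GB file well under S3's 10,000-part limit.
    part_size = 50 * 1024 * 1024
    part_count = int(math.ceil(source_size / float(part_size)))

    mp = bucket.initiate_multipart_upload('bigfile.tsv')
    try:
        for i in range(part_count):
            offset = i * part_size
            nbytes = min(part_size, source_size - offset)
            with open(source_path, 'rb') as fp:
                fp.seek(offset)
                # Uploads nbytes starting at the current file position.
                mp.upload_part_from_file(fp, part_num=i + 1, size=nbytes)
        mp.complete_upload()
    except Exception:
        # Abort so S3 does not keep storing (and billing for) orphaned parts.
        mp.cancel_upload()
        raise

s3multiput wraps essentially this loop and parallelizes the part uploads, so prefer the tool unless you need the control.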


The file did not exist, doh. I realized this after running the s3 commands in verbose mode by adding the -v flag:

 s3cmd put -v bigfile.tsv s3://bucket/bigfile.tsv 
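
In hindsight, a cheap pre-flight check before kicking off a long upload would have caught this immediately. A minimal sketch in Python (the path is just this question's example):

    import os
    import sys

    source_path = 'bigfile.tsv'  # example path from the question
    # Fail fast if the local file is missing, instead of silently doing nothing.
    if not os.path.isfile(source_path):
        sys.exit('local file not found: %s' % source_path)
    print('%s is %d bytes' % (source_path, os.path.getsize(source_path)))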

s3cmd version 1.1.0 supports multipart upload as part of the put command, but it is still in beta (as of this writing).
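
With 1.1.0 or later, the original command should split a 27 GB file into parts automatically; the part size can reportedly be tuned with the --multipart-chunk-size-mb option (assuming your build includes it), for example:

 s3cmd put --multipart-chunk-size-mb=50 bigfile.tsv s3://bucket/bigfile.tsv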

