Amazon S3 - Batch file upload using the Java API?

We want to start using S3 for some of our storage needs, and I'm looking for a way to batch-upload "N" files. I've already written code using the Java API to perform single-file uploads, but is there a way to provide a list of files to transfer to an S3 bucket?

I did look at the question is-it-possible-to-perform-a-batch-upload-to-amazon-s3, but it is two years old and I'm curious whether the situation has changed since then. I can't find a way to do this in code.

What we want to do is set up an internal job (possibly using Spring's scheduled tasks) that transfers groups of files every night. I'd like there to be a way to do this other than iterating over the files and making a request for each one, or zipping up batches to place on S3.

2 answers

The easiest way to go, if you are using the AWS SDK for Java, is the TransferManager. Its uploadFileList method takes a list of files and uploads them to S3 in parallel, and uploadDirectory will upload all files in a local directory.
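For reference, a minimal sketch of the uploadDirectory variant with the v1 SDK; the bucket name, key prefix and local directory are placeholders, not values from the question:

import java.io.File;

import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.MultipleFileUpload;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;

public class NightlyUploadSketch {
    public static void main(String[] args) throws InterruptedException {
        // The default client picks up credentials and region from the environment
        TransferManager tm = TransferManagerBuilder.standard()
                .withS3Client(AmazonS3ClientBuilder.defaultClient())
                .build();

        // Upload every file under /data/outgoing (recursively) with key prefix "nightly"
        MultipleFileUpload upload =
                tm.uploadDirectory("my-bucket", "nightly", new File("/data/outgoing"), true);
        upload.waitForCompletion();

        tm.shutdownNow();  // also shuts down the underlying S3 client
    }
}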


import java.io.File;
import java.util.List;

import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.MultipleFileUpload;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;

// Uploads a list of files to S3 in one call using TransferManager.uploadFileList
public void uploadDocuments(List<File> filesToUpload)
        throws AmazonServiceException, AmazonClientException, InterruptedException {
    // Build an S3 client with explicit credentials and a fixed region
    AmazonS3 s3 = AmazonS3ClientBuilder.standard()
            .withCredentials(getCredentials())
            .withRegion(Regions.AP_SOUTH_1)
            .build();

    TransferManager transfer = TransferManagerBuilder.standard().withS3Client(s3).build();
    String bucket = Constants.BUCKET_NAME;

    // Keys are derived relative to the given base directory (here ".", the working directory)
    MultipleFileUpload upload = transfer.uploadFileList(bucket, "", new File("."), filesToUpload);
    upload.waitForCompletion();
}

private AWSCredentialsProvider getCredentials() {
    String accessKey = Constants.ACCESS_KEY;
    String secretKey = Constants.SECRET_KEY;
    BasicAWSCredentials awsCredentials = new BasicAWSCredentials(accessKey, secretKey);
    return new AWSStaticCredentialsProvider(awsCredentials);
}
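To tie this back to the question's nightly job: a method like uploadDocuments above can simply be called from a Spring scheduled task. A rough sketch, with the cron expression and the file-gathering logic as placeholder assumptions:

import java.io.File;
import java.util.Arrays;
import java.util.List;

import org.springframework.scheduling.annotation.Scheduled;

@Scheduled(cron = "0 0 2 * * *")  // placeholder: every night at 02:00
public void nightlyS3Transfer() throws Exception {
    // Placeholder: collect the day's files from a staging directory
    File[] staged = new File("/data/outgoing").listFiles();
    if (staged == null || staged.length == 0) {
        return;  // nothing to upload tonight
    }
    List<File> files = Arrays.asList(staged);
    uploadDocuments(files);
}

This assumes the method lives in the same Spring-managed bean as uploadDocuments and that scheduling is enabled with @EnableScheduling.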
