Rubyzip: export zip file directly to S3 without burning tmpfile to disk?

I have this code that writes a zip file to disk, reads it back, uploads it to S3, and then deletes the file:

    compressed_file = some_temp_path

    Zip::ZipOutputStream.open(compressed_file) do |zos|
      some_file_list.each do |file|
        zos.put_next_entry(file.some_title)
        zos.print IO.read(file.path)
      end
    end # Write zip file

    s3 = Aws::S3.new(S3_KEY, S3_SECRET)
    bucket = Aws::S3::Bucket.create(s3, S3_BUCKET)
    bucket.put("#{BUCKET_PATH}/archive.zip", IO.read(compressed_file), {}, 'authenticated-read')

    File.delete(compressed_file)

This code already works, but I'd like to skip creating the zip file on disk. Is there a way to export the zipfile data directly to S3 without first writing a temp file, reading it back, and deleting it?

1 answer

I think I just found the answer to my question.

It's Zip::ZipOutputStream.write_buffer. I'll try it out and update this answer once I get it working.

Update

It works. My code is as follows:

    compressed_filestream = Zip::ZipOutputStream.write_buffer do |zos|
      some_file_list.each do |file|
        zos.put_next_entry(file.some_title)
        zos.print IO.read(file.path)
      end
    end # Outputs zipfile as StringIO

    s3 = Aws::S3.new(S3_KEY, S3_SECRET)
    bucket = Aws::S3::Bucket.create(s3, S3_BUCKET)

    compressed_filestream.rewind
    bucket.put("#{BUCKET_PATH}/archive.zip", compressed_filestream.read, {}, 'authenticated-read')

write_buffer returns a StringIO, and you need to rewind the stream before reading it. Now I don't have to create and delete a temp file.
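The rewind-then-read pattern isn't specific to rubyzip; here is a minimal sketch of the same idea using only Ruby's standard library Zlib (so it runs without the rubyzip gem or S3 credentials): compress into an in-memory StringIO, rewind, and read back the bytes you would hand to the upload call.

```ruby
require 'zlib'
require 'stringio'

# Compress into an in-memory buffer instead of a temp file on disk.
# ''.b gives a binary-encoded string so the buffer accepts raw gzip bytes.
buffer = StringIO.new(''.b)
gz = Zlib::GzipWriter.new(buffer)
gz.write('hello from memory')
gz.finish # flush the gzip trailer without closing the underlying StringIO

# After writing, the buffer's position sits at the end, so rewind before
# reading -- the same step needed with write_buffer's StringIO.
buffer.rewind
compressed = buffer.read # these bytes are what you'd upload

# Round-trip to confirm the in-memory bytes form a valid compressed stream.
original = Zlib::GzipReader.new(StringIO.new(compressed)).read
puts original # => "hello from memory"
```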

I'm just wondering now whether write_buffer is more memory-intensive than open, or the other way around?

