Gsutil rsync with gzip compression

I host public static resources in Google Cloud Storage, and I want to use the command gsutil rsync to synchronize our local version with the bucket, saving bandwidth and time. Part of our build process is to pre-gzip these resources, but gsutil rsync has no way to set the Content-Encoding header. This means we have to run gsutil rsync and then immediately run gsutil setmeta to set the header on all gzipped file types. This leaves the bucket in a broken state until the header is set. Another option is to use gsutil cp with the -z option, but that requires re-uploading the entire directory structure every time, including many image files and other non-gzipped resources, which wastes time and bandwidth.
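For reference, the non-atomic workaround described above looks roughly like this (the bucket name and file patterns are hypothetical):

```shell
# Step 1: sync everything; pre-gzipped objects are now served
# without the Content-Encoding header, i.e. broken for browsers.
gsutil -m rsync -r ./public gs://example-bucket

# Step 2: patch the header on the pre-gzipped file types afterwards.
gsutil -m setmeta -h "Content-Encoding:gzip" \
    "gs://example-bucket/**.js" "gs://example-bucket/**.css" "gs://example-bucket/**.html"
```

The window between the two commands is exactly the "bad state" the question is about.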

Is there an atomic way to execute rsync and set the correct Content-Encoding headers?

2 answers

Assuming you start with gzipped source files in source-dir, you can do:

gsutil -h content-encoding:gzip rsync -r source-dir gs://your-bucket

Note: if you do this and then run rsync in the opposite direction, it will decompress and copy all the objects back down:

gsutil rsync -r gs://your-bucket source-dir 

This can be surprising if you expect the content to stay gzipped, so the safest approach is to run rsync in one direction only.
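To confirm that objects were uploaded with the header set, you can inspect one with gsutil stat (the bucket and object names here are hypothetical):

```shell
gsutil stat gs://your-bucket/app.js
# The metadata listing should include a line like:
#     Content-Encoding:       gzip
```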


In case it helps someone else, here is the full process we use to serve both gzipped and non-gzipped content:

- Host the static site on Google Cloud Storage

- Prepare a local directory tree mirroring the gs bucket:

  • Separate the files worth gzipping (html, css, js...) from the rest.
  • Gzip those files, keeping the same file names (no .gz extension).
  • Push everything to the gs bucket with gsutil rsync.

In practice, this means running two commands:

For the gzipped files:

gsutil -m -h Content-Encoding:gzip rsync -c -r src/gzip gs://dst

For the non-gzipped files:

gsutil -m rsync -c -r src/none gs://dst

-m enables parallel transfers. -c compares checksums instead of modification times (gzipping changes the timestamps, which would otherwise make gsutil rsync re-upload everything), and -r recurses into subdirectories.

A script that automates all of this (and more) is described here: http://tekhoow.blogspot.fr/2016/10/deploying-static-website-efficiently-on.html
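The steps above can be sketched as a small POSIX shell script. The directory names (build, src/gzip, src/none) and the bucket gs://dst are assumptions for illustration; the gsutil calls are left commented out so the preparation step runs locally:

```shell
#!/bin/sh
set -e

# Sample input tree, so the preparation step can be run as-is:
mkdir -p build/js
printf 'console.log(1)\n' > build/js/app.js
printf 'not-really-a-png' > build/logo.png

rm -rf src/gzip src/none
find build -type f | while read -r f; do
  rel="${f#build/}"
  case "$f" in
    *.html|*.css|*.js|*.json|*.svg)
      # Compressible text assets: gzip -c writes to stdout, so the
      # destination keeps the original file name (no .gz suffix).
      mkdir -p "src/gzip/$(dirname "$rel")"
      gzip -9 -c "$f" > "src/gzip/$rel"
      ;;
    *)
      # Everything else is copied verbatim.
      mkdir -p "src/none/$(dirname "$rel")"
      cp "$f" "src/none/$rel"
      ;;
  esac
done

# Then push each tree with the matching headers (requires gsutil and a
# real bucket, hence commented out):
# gsutil -m -h Content-Encoding:gzip rsync -c -r src/gzip gs://dst
# gsutil -m rsync -c -r src/none gs://dst
```

Because gzipping rewrites the files on every build, the -c (checksum) flag on the rsync commands is what keeps unchanged files from being re-uploaded.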

