Our staging server is set up to use a different S3 bucket than our production server, but this requires us to manually synchronize the images between the buckets in order to see them on staging. Since we have tens of thousands of images (growing daily), this is not viable.
Is there any way to set up CarrierWave to read images from our production S3 bucket, but write any new images to the staging S3 bucket (so as not to pollute our production image store)?
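For context, a standard CarrierWave-on-fog setup points both reads and writes at a single bucket, roughly like this (the bucket name and environment variables here are placeholders, not our actual configuration); what we want is effectively two buckets, one for reads and one for writes:

```ruby
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.storage         = :fog
  config.fog_credentials = {
    provider:              'AWS',
    aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
  }
  # A single bucket serves both reads and writes; we want to split these.
  config.fog_directory = 'my-app-staging'
end
```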
UPDATE: I tried to create my own storage engine for CarrierWave that does this (see this method; it is basically identical to the Fog storage engine, except for line 228), but I get this error when trying to fetch images:
```
Excon::Errors::SocketError (hostname does not match the server certificate (OpenSSL::SSL::SSLError)):
  lib/carrier_wave/storage/dual_fog.rb:214:in `exists?'
  lib/carrier_wave/storage/dual_fog.rb:228:in `public_url'
  lib/carrier_wave/storage/dual_fog.rb:267:in `url'
```
Does anyone know why this happens? As you can see from the code, the idea is that reads try staging first and fall back to production if the image is not found on staging, while all write operations go only to staging.
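To make the intended behavior concrete, here is a minimal sketch of that read/write split using fog directly; the bucket names are placeholders and this is not the actual dual_fog.rb code, just the logic I am aiming for:

```ruby
require 'fog/aws'

connection = Fog::Storage.new(
  provider:              'AWS',
  aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

staging    = connection.directories.get('my-app-staging')     # placeholder bucket
production = connection.directories.get('my-app-production')  # placeholder bucket

# Reads check staging first and fall back to production when the key
# is missing; files.head returns nil for a nonexistent key.
def url_for(staging, production, key)
  file = staging.files.head(key) || production.files.head(key)
  file && file.public_url
end

# Writes always go to the staging bucket only.
def store(staging, key, io)
  staging.files.create(key: key, body: io, public: true)
end
```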