From the discussion in the comments, I arrived at a few key points that I would like to share.
Pre-Signed URLs
As @ceejayoz pointed out, pre-signed URLs are a good idea because:
- I can set the expiry as low as 10 seconds, which is enough time for any redirects and for the download to start, but not enough for the link to be passed around (see the sketch after this list).
- My previous understanding was that the download had to complete within the given time, so if the link expires in 10 seconds, the download would have to finish before that. But @ceejayoz pointed out that this is not the case: a download that has already started is allowed to complete.
- With CloudFront I can also restrict by IP address, which adds extra security.
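To make the first point concrete, here is a minimal sketch of generating a short-lived pre-signed URL with the AWS SDK for PHP; the bucket, key, and region are placeholders, and the 10-second expiry matches the point above.

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Placeholder client configuration; region and credentials
// depend on your environment.
$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Build the GetObject command for the protected file.
$command = $client->getCommand('GetObject', [
    'Bucket' => 'mybucket',
    'Key'    => 'path/to/file.zip',
]);

// Sign it with a very short expiry; a download that starts inside
// the window is allowed to finish after the URL expires.
$request = $client->createPresignedRequest($command, '+10 seconds');

// Redirect the user straight to S3.
header('Location: ' . (string) $request->getUri());
```

Note that restricting by IP address, as in the third point, is done with CloudFront signed URLs and a custom policy rather than with plain S3 pre-signed URLs.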
IAM Roles
He also pointed out another, not-so-good method: creating temporary IAM users. This is a maintenance nightmare if not done correctly, so only do it if you know what you are doing.
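If you do go down that road, the more common pattern is to issue temporary credentials through STS rather than creating real IAM users. This is only a sketch; the name, duration, and policy values are illustrative.

```php
<?php
require 'vendor/autoload.php';

use Aws\Sts\StsClient;

$sts = new StsClient([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Issue short-lived credentials scoped to a single object.
// Name, duration, and policy below are illustrative values.
$result = $sts->getFederationToken([
    'Name'            => 'download-user',
    'DurationSeconds' => 900, // the minimum allowed duration
    'Policy'          => json_encode([
        'Version'   => '2012-10-17',
        'Statement' => [[
            'Effect'   => 'Allow',
            'Action'   => 's3:GetObject',
            'Resource' => 'arn:aws:s3:::mybucket/path/to/file.zip',
        ]],
    ]),
]);

// Temporary key, secret, and session token to hand to the client.
$credentials = $result['Credentials'];
```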
Stream From S3
This is the method I have chosen for now. I may move to the first method later.
Warning: if you stream, your server remains the middleman and all data passes through it. So if it fails or is slow, the download will be slow.
My first question was how to register the stream wrapper:
Since I use Laravel, and Laravel uses Flysystem to manage S3, there was no easy way for me to get at the underlying S3Client. So I added the additional AWS SDK for Laravel package to my composer.json:

```json
"aws/aws-sdk-php-laravel": "~3.0"
```
Then I wrote my code as follows:
```php
<?php

use Aws\S3\Exception\S3Exception;
use Illuminate\Contracts\Bus\SelfHandling;

class FileDelivery extends Command implements SelfHandling
{
    private $client;
    private $remoteFile;
    private $bucket;

    public function __construct($remoteFile)
    {
        $this->client = AWS::createClient('s3');
        $this->client->registerStreamWrapper();
        $this->bucket = 'mybucket';
        $this->remoteFile = $remoteFile;
    }

    public function handle()
    {
        try {
            // First get the metadata of the object without fetching the body.
            $headers = $this->client->headObject(array(
                'Bucket' => $this->bucket,
                'Key'    => $this->remoteFile,
            ));
            $headers = $headers['@metadata'];
        } catch (S3Exception $e) {
            // The object is missing or inaccessible.
            http_response_code(404);
            return;
        }

        if ($headers['statusCode'] !== 200) {
            http_response_code(404);
            return;
        }

        // Send the appropriate headers before the stream starts.
        http_response_code($headers['statusCode']);
        header("Last-Modified: {$headers['headers']['last-modified']}");
        header("ETag: {$headers['headers']['etag']}");
        header("Content-Type: {$headers['headers']['content-type']}");
        header("Content-Length: {$headers['headers']['content-length']}");
        header('Content-Disposition: attachment; filename="' . basename($this->remoteFile) . '"');

        // File sizes can be large, so do not hold the body in an output
        // buffer: flush anything pending and disable buffering before the
        // stream starts.
        if (ob_get_level()) {
            ob_end_flush();
        }
        flush();

        // Start the stream.
        readfile("s3://{$this->bucket}/{$this->remoteFile}");
    }
}
```
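For context, this is roughly how the command might be dispatched from a controller; the route parameter is hypothetical, and I am assuming the Laravel 5 base controller's command-bus trait (DispatchesCommands) is available.

```php
// Hypothetical controller action; $this->dispatch() comes from the
// DispatchesCommands trait on the Laravel 5 base controller and,
// because FileDelivery is SelfHandling, calls its handle() method.
public function download($file)
{
    return $this->dispatch(new FileDelivery($file));
}
```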
My second question was: do I need to disable output buffering in Laravel?
IMHO the answer is yes. Disabling buffering lets data flush to the client immediately, which keeps memory consumption low. Since we are not using any Laravel function to offload the data to the client, Laravel does not do this for us, so we have to do it ourselves.
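As a side note, the single ob_end_flush() in the code above assumes at most one buffering level is active. A more defensive variant walks down every nested buffer before streaming:

```php
// Flush and close every nested output buffer, then push anything
// still pending to the client before readfile() starts streaming.
while (ob_get_level() > 0) {
    ob_end_flush();
}
flush();
```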