Download a large file via Apache / PHP (> 2 GB)

I am using a PHP script to control access to downloadable files. This works fine for anything under 2 GB, but fails for larger files.

  • Apache and PHP are 64-bit
  • Apache serves the file fine if it is accessed directly (which I cannot allow)

The PHP code (access-control logic omitted):

 if (ob_get_level()) ob_end_clean();
 error_log('FILETEST: '.$path.' : '.filesize($path));
 header('Content-Description: File Transfer');
 header('Content-Type: application/octet-stream');
 header('Content-Disposition: attachment; filename='.basename($path));
 header('Expires: 0');
 header('Cache-Control: must-revalidate');
 header('Pragma: public');
 header('Content-Length: ' . filesize($path));
 readfile($path);
 exit;

The error log shows the correct file size:

 [Tue Apr 08 11:01:16 2014] [error] [client *.*.*.*] FILETEST: /downloads/file.name : 2251373807, referer: http://myurl/files/ 

But the access log shows a negative size, which is the real size wrapped into a signed 32-bit integer (2251373807 - 2^32 = -2043593489):

  *.*.*.* - - [08/Apr/2014:11:01:16 +0100] "GET /files/file.name HTTP/1.1" 200 -2043593489 "http://myurl/files/" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0" 

So browsers refuse to download the file. In fact, with wget the server sends nothing at all:

 $ wget -S -O - http://myurl/files/file.name
 --2014-04-08 11:33:38--  http://myurl/files/file.name
 HTTP request sent, awaiting response... No data received.
 Retrying.
Tags: php, apache
4 answers

Try reading the file in chunks and flushing each chunk to the browser, instead of loading all 2 GB into local memory and flushing it in one go.

Replace readfile($path); with:

 @ob_end_flush();
 flush();
 $fileDescriptor = fopen($path, 'rb');
 while ($chunk = fread($fileDescriptor, 8192)) {
     echo $chunk;
     @ob_end_flush();
     flush();
 }
 fclose($fileDescriptor);
 exit;

The 8192-byte chunk size matters in some cases; see php.net/fread.

Adding some microtime() bookkeeping (and comparing it against the file pointer position) also lets you cap the maximum download speed, as sketched below.

* (How output-buffer flushing behaves also depends somewhat on the web server; using both calls ensures PHP at least tries to flush as much as possible.)
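
For illustration, a minimal sketch of that throttling idea. The $path variable comes from the question; the 1 MiB/s cap and the usleep-based pacing are assumptions, not part of the original answer:

 $maxBytesPerSec = 1024 * 1024; // assumed cap: 1 MiB/s
 $fileDescriptor = fopen($path, 'rb');
 $start = microtime(true);
 while ($chunk = fread($fileDescriptor, 8192)) {
     echo $chunk;
     @ob_end_flush();
     flush();
     // If the bytes sent so far should have taken longer at the target
     // speed than the time actually elapsed, sleep off the difference.
     $elapsed  = microtime(true) - $start;
     $expected = ftell($fileDescriptor) / $maxBytesPerSec;
     if ($expected > $elapsed) {
         usleep((int) (($expected - $elapsed) * 1000000));
     }
 }
 fclose($fileDescriptor);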


I have run into this problem before and used the script below to serve files. It streams the file in pieces rather than trying to send the whole thing at once. It also accounts for browser differences, since some browsers (namely IE) handle the headers in a slightly different way.

 private function outputFile($file, $name, $mime_type = '')
 {
     $fileChunkSize = 1024 * 30;
     if (!is_readable($file)) die('File not found or inaccessible!');
     $size = filesize($file);
     $name = rawurldecode($name);
     $known_mime_types = array(
         "pdf"  => "application/pdf",
         "txt"  => "text/plain",
         "html" => "text/html",
         "htm"  => "text/html",
         "exe"  => "application/octet-stream",
         "zip"  => "application/zip",
         "doc"  => "application/msword",
         "xls"  => "application/vnd.ms-excel",
         "ppt"  => "application/vnd.ms-powerpoint",
         "gif"  => "image/gif",
         "png"  => "image/png",
         "jpeg" => "image/jpeg",
         "jpg"  => "image/jpeg",
         "php"  => "text/plain"
     );

     // Guess the MIME type from the extension if none was given.
     if ($mime_type == '') {
         $file_extension = strtolower(substr(strrchr($file, "."), 1));
         if (array_key_exists($file_extension, $known_mime_types))
             $mime_type = $known_mime_types[$file_extension];
         else
             $mime_type = "application/force-download";
     }

     @ob_end_clean();
     if (ini_get('zlib.output_compression'))
         ini_set('zlib.output_compression', 'Off');

     header('Content-Type: ' . $mime_type);
     header('Content-Disposition: attachment; filename="' . $name . '"');
     header("Content-Transfer-Encoding: binary");
     header('Accept-Ranges: bytes');
     header("Cache-control: private");
     header('Pragma: private');
     header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");

     // Honour a single "bytes=start-end" range request so downloads can resume.
     if (isset($_SERVER['HTTP_RANGE'])) {
         list($a, $range) = explode("=", $_SERVER['HTTP_RANGE'], 2);
         list($range) = explode(",", $range, 2);
         list($range, $range_end) = explode("-", $range);
         $range = intval($range);
         if (!$range_end)
             $range_end = $size - 1;
         else
             $range_end = intval($range_end);
         $new_length = $range_end - $range + 1;
         header("HTTP/1.1 206 Partial Content");
         header("Content-Length: $new_length");
         header("Content-Range: bytes $range-$range_end/$size");
     } else {
         $new_length = $size;
         header("Content-Length: " . $size);
     }

     $chunksize = 1 * ($fileChunkSize);
     $bytes_send = 0;
     if ($file = fopen($file, 'rb')) {
         if (isset($_SERVER['HTTP_RANGE']))
             fseek($file, $range);
         while (!feof($file) && !connection_aborted() && ($bytes_send < $new_length)) {
             // Never read past the end of the requested range.
             $buffer = fread($file, min($chunksize, $new_length - $bytes_send));
             print($buffer);
             flush();
             $bytes_send += strlen($buffer);
         }
         fclose($file);
     } else {
         die('Error - can not open file.');
     }
     die();
 }
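
For reference, a rough usage sketch. The call site and the $path value are assumptions (and the method would need to be public, or invoked from inside the same class):

 // Hypothetical call site; $path is illustrative.
 $path = '/downloads/file.name';
 $this->outputFile($path, basename($path), 'application/octet-stream');

Because the method honours the HTTP_RANGE header, clients such as wget -c can resume interrupted downloads.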

Add this code before readfile($path);:

 ob_clean();
 flush();

Here is the full code I use for downloads:

 if (file_exists($file)) {
     header('Content-Description: File Transfer');
     header('Content-Type: application/octet-stream');
     header('Content-Disposition: attachment; filename='.basename($file));
     header('Content-Transfer-Encoding: binary');
     header('Expires: 0');
     header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
     header('Pragma: public');
     header('Content-Length: ' . filesize($file));
     ob_clean();
     flush();
     readfile($file);
     exit;
 }

Your best bet is to force Apache into HTTP chunked transfer mode with this function. That way you avoid holding the whole file in PHP memory.

 function readfile_chunked($filename, $retbytes = true)
 {
     $CHUNK_SIZE = 1024 * 1024;
     $buffer = '';
     $cnt = 0;
     $handle = fopen($filename, 'rb');
     if ($handle === false) {
         return false;
     }
     while (!feof($handle)) {
         $buffer = fread($handle, $CHUNK_SIZE);
         echo $buffer;
         @ob_flush();
         flush();
         if ($retbytes) {
             $cnt += strlen($buffer);
         }
     }
     $status = fclose($handle);
     if ($retbytes && $status) {
         return $cnt; // return number of bytes delivered, like readfile() does
     }
     return $status;
 }
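
A drop-in usage sketch for the question's script; the error-handling branch is an assumption, not part of this answer. Note that the question's Content-Length header should be omitted here, since the whole point is to let Apache fall back to chunked transfer encoding:

 // Replaces readfile($path); do not send Content-Length beforehand.
 $bytes = readfile_chunked($path);
 if ($bytes === false) {
     error_log('FILETEST: could not open ' . $path); // assumed error handling
 }
 exit;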
