How to download large files through a PHP script

Using PHP, I am trying to serve large files (possibly up to 200 MB) that, for authorization reasons, cannot live in a web-accessible directory. At the moment I am using a readfile() call along with a few headers to serve the file, but it seems that PHP loads the entire file into memory before sending it. I intend to deploy on a shared hosting server, which will not let me use a lot of memory or add my own Apache modules such as X-Sendfile.

I cannot allow my files to sit in a web-accessible directory for security reasons. Does anyone know a less memory-intensive method that I could deploy on a shared hosting server?

EDIT:

    if (/* My authorization here */) {
        $path = "/uploads/";
        $name = $row[0];           // Filename fetched from MySQL
        $fullname = $path . $name; // Build the full path
        $fd = fopen($fullname, "rb");
        if ($fd) {
            $fsize = filesize($fullname);
            $path_parts = pathinfo($fullname);
            $ext = strtolower($path_parts["extension"]);
            switch ($ext) {
                case "pdf":
                    header("Content-Type: application/pdf");
                    break;
                case "zip":
                    header("Content-Type: application/zip");
                    break;
                default:
                    header("Content-Type: application/octet-stream");
                    break;
            }
            header("Content-Disposition: attachment; filename=\"" . $path_parts["basename"] . "\"");
            header("Content-Length: $fsize");
            header("Cache-Control: private"); // Use this to open files directly
            while (!feof($fd)) {
                $buffer = fread($fd, 1 * (1024 * 1024)); // Read in 1 MB chunks
                echo $buffer;
                ob_flush();
                flush(); // These two flush calls seem to have helped with performance
            }
        } else {
            echo "Error opening file";
        }
        fclose($fd);
    }
+26
Tags: php, memory, download
4 answers

If you use fopen() and fread() instead of readfile(), that should solve your problem: reading and echoing the file in small chunks means only one chunk has to sit in memory at a time.

There's a solution in the PHP readfile() documentation that shows how to use fread() to do exactly what you want.
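A minimal sketch of that chunked approach, assuming the Content-Type, Content-Disposition, and Content-Length headers have already been sent and $filename is a placeholder for the already-authorized file path:

    // Serve a file in small chunks so that only one chunk (here 8 KB)
    // is held in PHP's memory at a time.
    function readfile_chunked($filename, $chunkSize = 8192) {
        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return false;
        }
        while (!feof($handle)) {
            echo fread($handle, $chunkSize);
            flush(); // Push the chunk out to the client instead of buffering it
        }
        return fclose($handle);
    }
    readfile_chunked($filename);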

+10

To allow large files to be downloaded from the server, I changed the settings below in the php.ini file:

    upload_max_filesize = 1500M
    max_input_time = 1000
    memory_limit = 640M
    max_execution_time = 1800
    post_max_size = 2000M

Now I can upload and download a 175 MB video on the server. Since I have a dedicated server, these changes were easy to make.

Below is the PHP script that serves the file for download. I have not made any changes to this code snippet to handle large file sizes.

    // Begin writing headers
    ob_clean(); // Clear any previously written headers from the output buffer
    if ($filetype == 'application/zip') {
        if (ini_get('zlib.output_compression'))
            ini_set('zlib.output_compression', 'Off');
        $fp = @fopen($filepath, 'rb');
        if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE")) {
            header("Content-Type: $content_type");
            header('Content-Disposition: attachment; filename="' . $filename . '"');
            header('Expires: 0');
            header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
            header('Content-Transfer-Encoding: binary');
            header('Pragma: public');
            header('Content-Length: ' . filesize(trim($filepath)));
        } else {
            header("Content-Type: $content_type");
            header('Content-Disposition: attachment; filename="' . $filename . '"');
            header('Content-Transfer-Encoding: binary');
            header('Expires: 0');
            header('Pragma: no-cache');
            header('Content-Length: ' . filesize(trim($filepath)));
        }
        fpassthru($fp);
        fclose($fp);
    } elseif ($filetype == 'audio' || $filetype == 'video') {
        global $mosConfig_absolute_path, $my;
        ob_clean();
        header('Pragma: public');
        header('Expires: 0');
        header('Cache-Control: no-store, no-cache, must-revalidate');
        header('Cache-Control: pre-check=0, post-check=0, max-age=0');
        header('Cache-Control: public');
        header('Content-Description: File Transfer');
        header('Content-Type: application/force-download');
        header("Content-Type: $content_type");
        header('Content-Length: ' . filesize(trim($filepath)));
        header("Content-Disposition: attachment; filename=\"$filename\""); // Force the download
        header('Content-Transfer-Encoding: binary');
        @readfile($filepath);
    } else { // All other file types except zip and audio/video
        ob_clean();
        header('Pragma: public');
        header('Expires: 0');
        header('Cache-Control: no-store, no-cache, must-revalidate');
        header('Cache-Control: pre-check=0, post-check=0, max-age=0');
        header('Cache-Control: public');
        header('Content-Description: File Transfer');
        header("Content-Type: $content_type");
        header('Content-Length: ' . filesize(trim($filepath)));
        header("Content-Disposition: attachment; filename=\"$filename\""); // Force the download
        header('Content-Transfer-Encoding: binary');
        @readfile($filepath);
    }
    exit;
+4

If you care about performance, there is X-Sendfile, available as a module for Apache, lighttpd, and nginx (where the equivalent mechanism is called X-Accel-Redirect). Check the user comments in the readfile() documentation.

There are also modules for these web servers that accept a URL containing a hash value and allow the file to be downloaded only for a short period of time. This can also be used to solve the authorization problem.
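For illustration, here is a sketch of the X-Sendfile approach. It assumes Apache with mod_xsendfile enabled and XSendFilePath configured to permit the storage directory, so it will not work on the asker's shared host; $authorized and $fullname are placeholders for your own authorization check and file path:

    // PHP only emits headers; the web server streams the file itself,
    // so PHP's memory usage stays minimal regardless of file size.
    if ($authorized) {
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="' . basename($fullname) . '"');
        header('X-Sendfile: ' . $fullname); // nginx uses X-Accel-Redirect instead
        exit;
    }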

+3
Jun 29 '11 at 23:26

You can also handle this in the style of the Gordian knot, that is, bypass the problem entirely: keep the files in an inaccessible directory, and at download time simply

    $tempstring = rand();
    symlink('/filestore/filename.extension', '/www/downloads/' . $tempstring . '-filename.extension');
    echo "Your download is available here: <a href='/downloads/" . $tempstring . "-filename.extension'>download</a>";

and set up a cron job to unlink() any download links older than 10 minutes (see the sketch below). Virtually no processing of your data is required, and no massaging of HTTP headers is needed.
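The cron job's cleanup step could be a small PHP script along these lines (a sketch: it assumes /www/downloads contains only these temporary symlinks, and 600 seconds matches the 10-minute window mentioned above):

    // Run from cron every few minutes to remove expired download links.
    foreach (glob('/www/downloads/*') as $link) {
        if (is_link($link) && time() - lstat($link)['mtime'] > 600) {
            unlink($link); // Link is older than 10 minutes
        }
    }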

There are even several libraries for this purpose.

+2
Jun 29 '11 at 23:25


