PHP Stop Remote File Download if it exceeds 5 MB

How can I stop downloading a remote file if it exceeds 5 MB? If I stop it mid-transfer, will the partial file be stored somewhere, in a temporary directory or in memory? How can I tell? Here is my current code:

    $url = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';
    $file = '../temp/test.jpg';

    file_put_contents($file, file_get_contents($url));
4 answers

There are several ways you could do this, but since you are currently using file_get_contents(), here is what I would do:

  • Open the remote file with fopen()
  • Read the file and save it with fread()/fwrite() in chunks - this allows you to track the current size and raise an error as soon as the threshold is passed.

Something like this:

    $url = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';
    $file = '../temp/test.jpg';
    $limit = 5 * 1024 * 1024; // 5MB

    if (!$rfp = fopen($url, 'r')) {
        // error, could not open remote file
    }
    if (!$lfp = fopen($file, 'w')) {
        // error, could not open local file
    }

    // Check the Content-Length header for an early exit (but don't trust it, see below)
    foreach ($http_response_header as $header) {
        if (preg_match('/^\s*content-length\s*:\s*(\d+)\s*$/i', $header, $matches)) {
            if ($matches[1] > $limit) {
                // error, file too large
            }
        }
    }

    // Copy the body in chunks, counting the bytes actually received
    $downloaded = 0;
    while (!feof($rfp) && $downloaded <= $limit) {
        $chunk = fread($rfp, 8192);
        fwrite($lfp, $chunk);
        $downloaded += strlen($chunk);
    }

    fclose($rfp);
    fclose($lfp);

    if ($downloaded > $limit) {
        // error, file too large
        unlink($file); // delete local data
    } else {
        // success
    }

Note: it's tempting to check the Content-Length: header before you download anything to see whether the file is too large - you can still do this if you want, but don't trust the value! The header is essentially arbitrary, and although it would be a protocol violation to send a Content-Length that does not match the actual body size, it can be used to trick your system into downloading a file that violates the rules. You need to count the bytes even if you check this header.

You can also do this using curl via a data callback, but I find this a less satisfactory solution, mainly because curl expects a plain function name rather than a standard callable type, which means you would need either a global variable or a static variable to track the downloaded content length, neither of which is acceptable (IMHO) for this.
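For completeness, here is a rough sketch of that callback approach. In recent PHP versions CURLOPT_WRITEFUNCTION does accept any callable, including a closure, so the byte counter can live in the closure's scope rather than in a global or static variable; the URL, limit and error handling below are only placeholders.

    $url   = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';
    $file  = '../temp/test.jpg';
    $limit = 5 * 1024 * 1024; // 5MB

    $downloaded = 0;
    $lfp = fopen($file, 'w');

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    // The write callback receives each chunk; returning a value that differs
    // from the chunk length makes curl abort the transfer.
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$downloaded, $lfp, $limit) {
        $downloaded += strlen($chunk);
        if ($downloaded > $limit) {
            return -1; // abort: curl_exec() will return false
        }
        fwrite($lfp, $chunk);
        return strlen($chunk);
    });

    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($lfp);

    if (!$ok) {
        // transfer failed or file too large - remove the partial download
        unlink($file);
    }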


As for checking up front whether the URL exceeds the size limit, it is not too complicated ... but file_get_contents will not do it for you. You could fopen the URL and use stream_get_meta_data on the stream to get an array of stream information. If I'm reading the examples correctly (and if the transfer encoding is not "chunked"), you will find a Content-Length header there, which indicates the file size. If that value is set and exceeds your limit, bail out.

If the stream metadata does not have this information, or the information says the file is small enough, you still have the stream; you can still fread from the remote file and fwrite to the local one in chunks until you have read everything or hit the limit. Of course, if you get to that point, the local file will already exist by the time you reach the limit; you will have to delete it yourself.
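A rough sketch of that approach (the limit and paths are illustrative, and a plain, non-chunked HTTP response is assumed): stream_get_meta_data() exposes the raw response headers in its 'wrapper_data' element, so Content-Length can be checked before reading, with the byte-counting loop as the fallback.

    $url   = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';
    $file  = '../temp/test.jpg';
    $limit = 5 * 1024 * 1024; // 5MB

    $rfp = fopen($url, 'r');
    if (!$rfp) {
        die('Could not open remote file');
    }

    // The http:// wrapper exposes the raw response headers in 'wrapper_data'
    $meta = stream_get_meta_data($rfp);
    foreach ($meta['wrapper_data'] as $header) {
        if (preg_match('/^content-length:\s*(\d+)/i', $header, $m) && (int)$m[1] > $limit) {
            fclose($rfp);
            die('File too large according to Content-Length');
        }
    }

    // Header absent or small enough: copy in chunks and count bytes anyway
    $lfp = fopen($file, 'w');
    $downloaded = 0;
    while (!feof($rfp) && $downloaded <= $limit) {
        $chunk = fread($rfp, 8192);
        $downloaded += strlen($chunk);
        fwrite($lfp, $chunk);
    }
    fclose($rfp);
    fclose($lfp);

    if ($downloaded > $limit) {
        unlink($file); // the partial file has to be removed by hand
    }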


This is not possible with file_get_contents alone - you will need to use the PHP cURL extension to retrieve the file size first.

    <?php

    $max_length = 5 * 1024 * 1024; // 5MB
    $url = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';

    // Issue a HEAD request so only the headers are transferred
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HEADER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

    $data = curl_exec($ch);
    curl_close($ch);

    if ($data === false) {
        echo 'cURL failed';
        exit;
    }

    $contentLength = null;
    if (preg_match('/Content-Length:\s*(\d+)/i', $data, $matches)) {
        $contentLength = (int)$matches[1];
    }

    if ($contentLength === null) {
        echo "Content-Length: unknown\n";
    } elseif ($contentLength > $max_length) {
        echo "Content-Length: $contentLength\nFile is too large\n";
    } else {
        echo "Content-Length: $contentLength\nFile size is ok\n";
    }

Wrap this in a function and use it to check any URL before downloading it - anything that is too large can be skipped.
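For example, a small wrapper along those lines might look like this (the function name is just an illustration); it returns the reported length in bytes, or false when the request fails or the header is missing:

    // Hypothetical helper: returns the reported Content-Length in bytes,
    // or false if the request failed or the header was not present.
    function remote_content_length($url)
    {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $data = curl_exec($ch);
        curl_close($ch);

        if ($data === false || !preg_match('/Content-Length:\s*(\d+)/i', $data, $m)) {
            return false;
        }
        return (int)$m[1];
    }

    $length = remote_content_length($url);
    if ($length !== false && $length <= 5 * 1024 * 1024) {
        // small enough (according to the server) - go ahead and download it
    }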

Downside:

This method only works if the server returns a Content-Length header - not all servers behave well, and not all will return the correct length.

Upside:

There is no need to download and write up to $max_length bytes just to find out the file is too big, so this method is faster and less resource intensive.

Adapted from:

http://php.net/filesize#92462


Here is a variation of DaveRandom's solution along the same lines, but one that does not write the local file until the download has completed within the limit.

    <?php

    define('MAXMB', 5);
    define('BYTESPERMB', 1048576);

    $maxsize = MAXMB * BYTESPERMB;

    $handle = fopen("http://www.example.com/", "rb");
    if (!$handle) {
        die('Could not open remote file');
    }

    $contents = '';
    $size = 0;
    $completed = true;

    while (!feof($handle)) {
        $chunk = fread($handle, 8192);
        $size += strlen($chunk);
        if ($size > $maxsize) {
            // Limit exceeded - stop reading and discard what we have
            $completed = false;
            break;
        }
        $contents .= $chunk;
    }
    fclose($handle);

    if ($completed) {
        // File appears to be completely downloaded, so write it
        file_put_contents('path/to/localfile', $contents);
    } else {
        // It was too big
    }
