Why does readfile() exhaust PHP memory?

I've seen a lot of questions about how to use PHP efficiently to serve file downloads, rather than allowing direct HTTP requests (to keep files secure, to track downloads, etc.).

The answer is almost always PHP's readfile() .

  • Download large files in PHP
  • How to force downloading large files without using too much memory?
  • Best way to transparently log downloads?

BUT, although it works great during testing with huge files, on a live site with hundreds of users the downloads start to hang and PHP memory limits are exhausted.

So what is it about the way readfile() operates that makes memory blow up when traffic is high? I thought it was supposed to avoid heavy PHP memory usage by writing directly to the output buffer?

EDIT: (To clarify, I'm looking for "why", not "what can I do about it". I think Apache's mod_xsendfile is the best way around it.)
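For reference, a minimal sketch of that approach (the module must be installed; the paths here are hypothetical): PHP sends an X-Sendfile header and exits, and Apache streams the file itself, so the bytes never pass through PHP.

 # Apache config, with mod_xsendfile loaded
 XSendFile On
 XSendFilePath /var/files

 <?php
 // download.php: authorize the request, then hand the transfer to Apache
 header('Content-Type: application/octet-stream');
 header('Content-Disposition: attachment; filename="report.pdf"');
 header('X-Sendfile: /var/files/report.pdf');
 exit;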

+21
php memory readfile
Jul 08 '11 at 17:12
6 answers
Description:

 int readfile ( string $filename [, bool $use_include_path = false [, resource $context ]] )

Reads a file and writes it to the output buffer.

PHP has to read the file and write it to the output buffer. So, for a 300 MB file, no matter which version you write (many small chunks or one big chunk), PHP ends up reading all 300 MB of the file.
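To illustrate, here is the kind of chunked variant people write (a sketch; readfile_chunked and the 8 KB chunk size are just illustrative). Even reading 8 KB at a time, the full 300 MB still flows through PHP, and if output buffering is on, those chunks accumulate in the buffer instead of reaching the client:

 function readfile_chunked($filename)
 {
     $handle = fopen($filename, 'rb');
     if ($handle === false) {
         return false;
     }
     while (!feof($handle)) {
         $chunk = fread($handle, 8192); // 8 KB at a time, but it all adds up
         if ($chunk === false) {
             break;
         }
         echo $chunk;
         flush(); // only reaches the client if nothing is buffering the output
     }
     return fclose($handle);
 }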

If multiple users are downloading at the same time, this becomes a problem. (On a shared server, hosting providers limit the memory given to each hosting user; with such limited memory, buffering whole files is not a good idea.)

I think serving the file via a direct link is a much better approach for large files.

+4
Jul 08 '11

If you have output buffering enabled, then use ob_end_flush() right before calling readfile():

 header(...);
 ob_end_flush();
 @readfile($file);
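A slightly fuller sketch of the same idea ($file is a hypothetical path; the ob_get_level() guard avoids a notice when no buffer is active):

 header('Content-Type: application/octet-stream');
 header('Content-Length: ' . filesize($file));
 if (ob_get_level()) {
     ob_end_flush(); // flush and disable PHP's output buffer
 }
 @readfile($file);   // writes now go straight to the client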
+2
Nov 16 '12 at 12:57

You can also disable output buffering just for a specific location using the PHP output_buffering configuration directive.

Apache example:

 <Directory "/your/downloadable/files">
     ...
     php_admin_value output_buffering "0"
     ...
 </Directory>

The value Off seems to work as well, though it really should raise an error, at least judging by how other types are converted to booleans in PHP . *Shrugs*
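If you cannot touch the server config, output_buffering is a PHP_INI_PERDIR setting, so (assuming mod_php and AllowOverride Options or All) the same thing should be possible from an .htaccess file:

 php_value output_buffering 0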

+1
Oct 05 '12 at 15:30

I came up with this in the past (as part of my library) to avoid high memory usage:

 function suTunnelStream( $sUrl, $sMimeType, $sCharType = null )
 {
     // open the source in binary mode
     $f = @fopen( $sUrl, 'rb' );
     if( $f === false ) {
         return false;
     }

     $b = false; // headers sent yet?
     $u = true;  // result of the last read

     while( $u !== false && !feof( $f ) ) {
         $u = @fread( $f, 1024 );
         if( $u !== false ) {
             if( !$b ) {
                 // on the first chunk: clear the buffers and send headers
                 // (the su* helpers are part of the library)
                 $b = true;
                 suClearOutputBuffers();
                 suCachedHeader( 0, $sMimeType, $sCharType, null,
                     !suIsValidString( $sCharType )
                         ? ( 'content-disposition: attachment; filename="' . suUniqueId( $sUrl ) . '"' )
                         : null );
             }
             echo $u; // stream the chunk straight to the client
         }
     }

     @fclose( $f );
     return ( $b && $u !== false );
 }

Perhaps this can give you some inspiration.
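The su* helpers above belong to the author's library. A rough standalone equivalent using only standard PHP functions might look like this (the names and the 1024-byte chunk size are just illustrative):

 function tunnel_stream($url, $mimeType)
 {
     $f = @fopen($url, 'rb');
     if ($f === false) {
         return false;
     }
     // drop all output buffers so each chunk goes straight to the client
     while (ob_get_level()) {
         ob_end_clean();
     }
     header('Content-Type: ' . $mimeType);
     header('Content-Disposition: attachment; filename="' . basename($url) . '"');
     while (!feof($f)) {
         $chunk = fread($f, 1024);
         if ($chunk === false) {
             break;
         }
         echo $chunk;
         flush();
     }
     return fclose($f);
 }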

+1
Jul 25 '13 at 14:03

As already mentioned here: "Allowed memory .. exhausted" when using readfile, the following code at the top of the PHP file helped.

It checks whether PHP output buffering is active and, if so, turns it off.

 if (ob_get_level()) {
     ob_end_clean();
 }
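Note that ob_end_clean() only removes one level of buffering; if buffers can be nested (e.g. a framework started its own), a loop is the safer variant:

 while (ob_get_level()) {
     ob_end_clean();
 }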
+1
Aug 17 '16 at 1:26

Well, it is memory intensive. Instead of using readfile(), I would point users at a static server that has a specific rule set in place to control downloads.

If that's not an option, add more RAM to meet the load, or introduce a queuing system that gracefully controls server usage.
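For example, with nginx in front, a common pattern is an internal location plus an X-Accel-Redirect header (paths here are hypothetical): PHP only authorizes the request, and nginx streams the bytes.

 # nginx config: files under /protected/ are not directly reachable
 location /protected/ {
     internal;
     alias /var/files/;
 }

 <?php
 // PHP side: check access, then delegate the transfer to nginx
 header('Content-Type: application/octet-stream');
 header('X-Accel-Redirect: /protected/report.pdf');
 exit;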

0
Jul 08 '11


