Reduce memory usage in PHP when handling uploads via php://input

I am running nginx 1.0.5 + php-cgi (PHP 5.3.6). I need to upload files of about 1 GB (with 1-5 parallel uploads). I am trying to implement large-file uploads via AJAX. Everything works, but PHP eats a lot of memory for each upload. I have set memory_limit = 200M, but that only copes with uploads of roughly 150 MB; if the file is larger, the upload fails. I could keep raising memory_limit, but that seems wrong, because PHP could end up eating all the memory. I use this PHP code (simplified) to handle uploads on the server side:

    $input = fopen('php://input', 'rb');
    $file  = fopen('/tmp/' . $_GET['file'] . microtime(), 'wb');

    while (!feof($input)) {
        fwrite($file, fread($input, 102400));
    }

    fclose($input);
    fclose($file);
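For what it is worth, the loop above only ever holds about 100 KB in PHP memory at a time, so the copy itself is probably not what exhausts memory_limit (see the first answer below about the raw POST data being buffered). A minimal alternative sketch, assuming you keep reading from php://input and that the target path shown is only illustrative, which lets PHP do the chunked copy internally:

    <?php
    // Sketch: stream the request body straight to disk without ever
    // building the whole file as a PHP string.
    $input  = fopen('php://input', 'rb');
    // basename() is only an illustrative guard against path characters in
    // the user-supplied name; adapt to your own naming scheme.
    $target = '/tmp/' . basename($_GET['file']) . microtime(true);
    $output = fopen($target, 'wb');

    // stream_copy_to_stream() copies in internal chunks, so peak PHP
    // memory stays small regardless of the file size.
    stream_copy_to_stream($input, $output);

    fclose($input);
    fclose($output);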

/etc/nginx/nginx.conf:

    user www-data;
    worker_processes 100;
    pid /var/run/nginx.pid;

    events {
        worker_connections 768;
        # multi_accept on;
    }

    http {
        ##
        # Basic Settings
        ##
        sendfile on;
        tcp_nopush on;
        tcp_nodelay on;
        keepalive_timeout 65;
        types_hash_max_size 2048;
        client_max_body_size 2g;
        # server_tokens off;

        server_names_hash_max_size 2048;
        server_names_hash_bucket_size 128;
        # server_names_hash_bucket_size 64;
        # server_name_in_redirect off;

        include /etc/nginx/mime.types;
        default_type application/octet-stream;

        ##
        # Logging Settings
        ##
        access_log /var/log/nginx/access.log;
        error_log /var/log/nginx/error.log;

        ##
        # Gzip Settings
        ##
        gzip on;
        gzip_disable "msie6";

        include /etc/nginx/conf.d/*.conf;
        include /etc/nginx/sites-enabled/*;
    }

/etc/nginx/sites-enabled/srv.conf:

    server {
        listen 80;
        server_name srv.project.loc;

        # Define root
        set $fs_webroot "/home/andser/public_html/project/srv";
        root $fs_webroot;
        index index.php;

        # robots.txt
        location = /robots.txt {
            alias $fs_webroot/deny.robots.txt;
        }

        # Domain root
        location / {
            if ($request_method = OPTIONS) {
                add_header Access-Control-Allow-Origin "http://project.loc";
                add_header Access-Control-Allow-Methods "GET, OPTIONS, POST";
                add_header Access-Control-Allow-Headers "Authorization,X-Requested-With,X-File-Name,Content-Type";
                #add_header Access-Control-Allow-Headers "*";
                add_header Access-Control-Allow-Credentials "true";
                add_header Access-Control-Max-Age "10000";
                add_header Content-Length 0;
                add_header Content-Type text/plain;
                return 200;
            }
            try_files $uri $uri/ /index.php?$query_string;
        }

        #error_page 404 /404.htm

        location ~ index.php {
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $fs_webroot/$fastcgi_script_name;
            include fastcgi_params;
            fastcgi_param REQUEST_METHOD $request_method;
            fastcgi_param PATH_INFO $fastcgi_script_name;
            add_header Pragma no-cache;
            add_header Cache-Control no-cache,must-revalidate;
            add_header Access-Control-Allow-Origin *;
            #add_header Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-File-Name";
        }
    }

Does anyone know how to reduce PHP memory usage? Thanks.

+8
javascript ajax php upload nginx
3 answers

There is a hack that involves faking the content type header, changing it from application/octet-stream to multipart/form-data. This stops PHP from populating $HTTP_RAW_POST_DATA. Read more: https://github.com/valums/file-uploader/issues/61
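To expand on that: with an unrecognised content type such as application/octet-stream, PHP 5.3 copies the whole request body into $HTTP_RAW_POST_DATA, so memory use grows with the file size even though your own loop only reads small chunks. A quick diagnostic sketch you could drop at the end of the upload handler (standard PHP functions, illustrative logging only):

    <?php
    // Log peak memory for this request and whether the raw POST body was
    // duplicated into $HTTP_RAW_POST_DATA.
    error_log('peak memory: ' . memory_get_peak_usage(true) . ' bytes');
    error_log('raw post buffered: ' . (isset($HTTP_RAW_POST_DATA)
        ? strlen($HTTP_RAW_POST_DATA) . ' bytes'
        : 'no'));

If the second line reports a size close to the uploaded file, the header trick (or turning off always_populate_raw_post_data) is what you need.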

+4

I have been in the same shoes, and this is what I did: split the file into chunks during the upload process.

A good example is plupload (http://www.plupload.com/index.php), or you could try the Java applet JUpload (http://jupload.sourceforge.net), which can also resume uploads if a network problem occurs, etc.

The most important point: if you want your files uploaded via a web browser, there is nothing stopping you from doing it in chunks (a minimal server-side sketch follows below).
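A sketch of the server side of such a chunked upload, assuming the client (e.g. plupload in multipart mode) posts each piece as a "file" field together with "chunk"/"chunks" counters; the parameter and field names here are the uploader's defaults and may need adjusting:

    <?php
    // Sketch of a chunk-append handler. Each request carries only one
    // small piece, so PHP memory use stays flat.
    $chunk   = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0;
    $chunks  = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1;
    $name    = isset($_POST['name'])   ? basename($_POST['name']) : 'upload.bin';
    $partial = '/tmp/' . $name . '.part';

    // Append this chunk to the partial file (truncate on the first chunk).
    $out = fopen($partial, $chunk === 0 ? 'wb' : 'ab');
    $in  = fopen($_FILES['file']['tmp_name'], 'rb');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    // Once the last chunk has arrived, move the assembled file into place.
    if ($chunk === $chunks - 1) {
        rename($partial, '/tmp/' . $name);
    }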

+1

Why don't you try using Flash to upload huge files? For example, you could try swfupload, which has good PHP support.
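If the client ends up sending a regular multipart/form-data upload (which is what Flash uploaders like swfupload typically send), PHP streams the body to a temporary file on disk rather than keeping it in memory, so memory_limit is not the bottleneck; only upload_max_filesize and post_max_size need to be large enough. A minimal sketch, assuming swfupload's default field name "Filedata":

    <?php
    // Sketch: handle a plain multipart upload. PHP has already written the
    // body to a temp file on disk, so no 1 GB string lives in memory.
    if (isset($_FILES['Filedata']) && $_FILES['Filedata']['error'] === UPLOAD_ERR_OK) {
        move_uploaded_file(
            $_FILES['Filedata']['tmp_name'],
            '/tmp/' . basename($_FILES['Filedata']['name'])
        );
    }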

0
