I am developing a file upload service for my company. Our users often send us very large ZIP files full of large Illustrator files. As a rule, files will not be larger than 1.5 GB, but I need to plan for files up to 4 GB.
Obviously, this raises a lot of questions about how to configure Apache and PHP to allow such large transfers without overloading the server or opening security holes.
Group 1: Memory Limits
One requirement is that users must be able to download their files again after uploading them. I avoided serving the files as direct links for security reasons: users must not be able to download each other's files. My solution was to store the uploaded files in a protected directory outside the web root and serve them through a PHP script. The script looks something like this:
$fo = fopen($uploadDir . $file_name, "rb"); // binary mode, so Windows doesn't mangle the data
if ($fo === false) {
    die("Could not open file");
}
while (!feof($fo)) {
    $chunk = fread($fo, 102400); // read 100 KB at a time
    echo $chunk;
    ob_flush();
    flush(); // actually push the chunk out to the client
}
fclose($fo);
Since fread reads the file in small chunks, memory usage should stay low regardless of file size, even for a 4 GB file and even with many simultaneous downloads (say, 20 people). My question: is this reasoning correct, and will a 1.5 GB download actually work this way?
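For reference, here is a sketch of the php.ini settings that typically come into play with transfers this large. The specific values are illustrative assumptions for the 4 GB target above, not tested recommendations:

```
; php.ini - illustrative values, adjust for your environment
memory_limit = 64M          ; chunked fread keeps usage low, so this need not scale with file size
output_buffering = Off      ; avoid buffering the whole response in memory
upload_max_filesize = 4096M ; allow uploads up to ~4 GB
post_max_size = 4100M       ; must be slightly larger than upload_max_filesize
max_input_time = -1         ; don't time out while the request body is still arriving
```

On the Apache side, `LimitRequestBody` (0 means unlimited) can also cap the request body size, so it needs to be at least as large as the biggest upload you accept.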
Group 2: Execution Time
Uploads and downloads of files this size inevitably take a long time. Even at around 1 MB/s, a transfer runs for something like 35 minutes, and users on slower connections will need far longer.
PHP's max_execution_time defaults to 30 seconds. Is it safe to raise it substantially, or even disable the limit entirely? And what happens if the user's connection drops mid-transfer: will the script be terminated, or will it keep running on the server? How should I handle that?
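One way to handle both concerns inside the download script itself is sketched below. It assumes the chunked-fread loop from above; set_time_limit(), ignore_user_abort(), and connection_aborted() are standard PHP functions:

```
set_time_limit(0);        // lift max_execution_time for this long-running script only
ignore_user_abort(true);  // keep control after a disconnect, so we can clean up ourselves

$fo = fopen($uploadDir . $file_name, "rb");
while (!feof($fo)) {
    echo fread($fo, 102400);
    ob_flush();
    flush();
    if (connection_aborted()) {
        // the client disconnected: stop reading instead of streaming into the void
        break;
    }
}
fclose($fo);
```

This keeps the global max_execution_time at its default for the rest of the site while letting the transfer script run as long as the download does, and it exits the loop promptly when the client goes away.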
Finally, I would rather not use a Java or Silverlight based uploader, since not all of our users have Java installed. Right now I am leaning toward SWFUpload with jQuery.