Large file uploads

I am working on an application that lets users upload and store large files on a web server. I am currently using PHP to process the files POSTed over HTTP. My php.ini has:

upload_max_filesize = 100M
post_max_size = 100M
memory_limit = 128M
max_input_time = 6000
max_execution_time = 6000

As far as I can tell, no Apache LimitRequestBody directive is set. I use APC to track upload progress. For some reason, uploads always stop at exactly 50M.
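One quick sanity check (a minimal sketch, not part of my original setup; check_limits.php is just an illustrative name) is to dump the limits the web SAPI actually applies from a script on the same server, since an .htaccess php_value, a vhost setting, or a different php.ini can silently override the values above:

<?php
// check_limits.php: print the upload-related settings as seen by
// the PHP instance that actually receives the POST
foreach (array('upload_max_filesize', 'post_max_size', 'memory_limit',
               'max_input_time', 'max_execution_time') as $key) {
    echo $key, ' = ', ini_get($key), "\n";
}
// shows which php.ini the web SAPI loaded (PHP >= 5.2.4)
echo 'Loaded php.ini: ', php_ini_loaded_file(), "\n";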

I know that HTTP is not the most efficient way to transfer large files, but this application needs to be user-friendly, and I understand that FTP has its own firewall problems.

Can anyone suggest what might be stopping my uploads at exactly 50M? It must be some kind of configuration setting.

Also, are there any other options besides JavaScript/PHP over HTTP for uploading files? I have looked at Java applets and Flash uploaders, and I would probably go with SWFUpload, but if it is a server configuration issue that is making my HTTP uploads fail, I don't see how a Java applet or a Flash uploader would get around it.

I should note that I eventually hope to end up with a solution that allows very large uploads, up to 1 GB.

I use very simple PHP to receive the file:

$uploaddir = '/' . $_POST['upload_directory'] . '/';
$uploadfile = $uploaddir . basename($_FILES['file']['name']);

if (is_uploaded_file($_FILES['file']['tmp_name'])) {
    if (move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile)) {
        // ... success handling ...
    }
}

There is obviously a little more to it than that, but that's the essence of how I handle the upload.
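For what it's worth, checking the per-file error code before is_uploaded_file() usually says why an upload was cut off. This is only a sketch, assuming the same 'file' field name as in the handler above:

<?php
// Inspect the error code PHP recorded for this upload
switch ($_FILES['file']['error']) {
    case UPLOAD_ERR_OK:
        break;                                   // upload arrived intact
    case UPLOAD_ERR_INI_SIZE:
        die('Larger than upload_max_filesize');  // php.ini limit hit
    case UPLOAD_ERR_FORM_SIZE:
        die('Larger than MAX_FILE_SIZE');        // hidden form-field limit hit
    case UPLOAD_ERR_PARTIAL:
        die('File was only partially uploaded'); // transfer cut off
    default:
        die('Upload failed, error code ' . $_FILES['file']['error']);
}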

2 answers

If your front end is ExtJS, check the request it builds: a MAX_UPLOAD_FILESIZE value of 50M sent with the POST would cap the upload at exactly 50M, regardless of your php.ini settings.
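For reference, PHP's standard form-level limit is a hidden field literally named MAX_FILE_SIZE placed before the file input; if the file exceeds it, PHP rejects the upload and reports UPLOAD_ERR_FORM_SIZE. A sketch of what to look for in the form side of the upload (action URL and field values here are only illustrative):

<!-- upload_form.php (illustrative name) -->
<!-- MAX_FILE_SIZE is read by PHP, not enforced by the browser, and must
     appear before the file input; 52428800 bytes is exactly 50M -->
<form action="upload.php" method="post" enctype="multipart/form-data">
    <input type="hidden" name="MAX_FILE_SIZE" value="52428800">
    <input type="hidden" name="upload_directory" value="files">
    <input type="file" name="file">
    <input type="submit" value="Upload">
</form>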

