AJAX upload - not waiting for a response before proceeding

I'm using the Blueimp jQuery File Upload plugin (which is very good, by the way) with a PHP handler to receive the uploaded files, and then transferring them to S3 via the S3 API (from the AWS PHP SDK).

It works. The problem is that with large files (> 1 GB) the transfer to S3 (via create_object) can take several minutes, and the PHP script that does it blocks until the transfer completes. Meanwhile the uploader (which uses jQuery's ajax method) seems to give up waiting for the response and starts the upload again each time.

I first thought this was down to max_input_time in php.ini, or something similar, since it seemed to wait about 60 seconds before giving up. I increased max_input_time and the other related settings in php.ini, but got no further.

I also suspect (more likely) that the timeout is in the JavaScript, either in the plugin or in the jQuery ajax method. The developer (blueimp) says there is no such timeout in the plugin, and I couldn't find one. And although "timeout" is one of the jQuery ajax options, it appears to apply to the entire request duration rather than just the wait for the response, so it's of little use here.
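For reference, the relevant bit looks roughly like this; a minimal sketch, assuming the plugin passes ajax settings such as timeout straight through to $.ajax (the selector, URL and callbacks are illustrative, not my actual code):

$('#fileupload').fileupload({
    url: 'yourScript.php',   // illustrative upload handler URL
    timeout: 0,              // 0 = no client-side timeout, but it covers the whole request, not just the wait for the response
    done: function (e, data) {
        console.log('Upload finished', data.result);
    },
    fail: function (e, data) {
        console.log('Upload failed', data.errorThrown);
    }
});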

Any help or guidance gratefully received.

2 answers

jQuery docs at http://api.jquery.com/jQuery.ajax/ say:

timeout

Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.

It might also be a good idea to check PHP's set_time_limit and memory limit settings.

In any case, the best approach, it seems to me, is to attach an error callback, for example:

$.ajax('yourScript.php', { error: function(jqXHR, textStatus, errorThrown) { console.log(textStatus, errorThrown); } });

jQuery docs say:

Error callbacks are invoked, in the order they are registered, if the request fails. They receive the jqXHR, a string indicating the error type, and an exception object if applicable. Some built-in errors will provide a string as the exception object: "abort", "timeout", "No Transport".

This may give you a hint about which side (server or client) aborted the transfer.

Hope this helps


Uploading large files in one go can cause all sorts of problems. It is better to split the file into chunks on the client side and send them to the server one chunk at a time; the server then reassembles the chunks into the original file before pushing it to S3.

To cut the file into chunks on the client side you can use window.FileReader and File.prototype.slice.
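A rough sketch of that approach (not the plugin's built-in mechanism): slice the file and post each piece in turn to a hypothetical upload.php that appends the chunks back together before sending the result to S3. All names and the chunk size here are made up:

var file = document.querySelector('input[type=file]').files[0];
var chunkSize = 5 * 1024 * 1024;   // 5 MB per chunk (arbitrary)
var totalChunks = Math.ceil(file.size / chunkSize);

function sendChunk(index) {
    if (index >= totalChunks) { console.log('All chunks sent'); return; }
    var blob = file.slice(index * chunkSize, (index + 1) * chunkSize);
    var form = new FormData();
    form.append('chunk', blob);
    form.append('index', index);
    form.append('total', totalChunks);
    form.append('name', file.name);
    $.ajax('upload.php', {            // hypothetical server-side reassembly script
        type: 'POST',
        data: form,
        processData: false,           // required when posting a FormData object
        contentType: false,
        success: function () { sendChunk(index + 1); },   // send the next chunk only once this one is accepted
        error: function (jqXHR, textStatus) { console.log('Chunk ' + index + ' failed: ' + textStatus); }
    });
}
sendChunk(0);

If hand-rolling this is unappealing, the blueimp plugin also offers a maxChunkSize option that performs chunked uploads for you, though the server side still has to reassemble the pieces.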
