Large file downloads

Do large file downloads block the application request/response cycle? I have an application that lets users upload several large files (images, in particular), which are stored on a remote host. I cannot use asynchronous background jobs to transfer these images, because they must be immediately available to the user as soon as the transfer completes. What is the best way to handle such large downloads? Do they affect concurrency? This is my first time dealing with large-scale downloads. What should I be wary of, besides huge bills, of course? Any input from developers who have built applications that handle large file downloads would be greatly appreciated.

2 answers

Why can't you use asynchronous downloading and simply handle an event that fires when it completes? That is typically how asynchronous operations work: you start them, store a handle somewhere, and then either handle the "Completed" event or periodically iterate over the handles for the downloads you started, checking each one to see whether it has finished.
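The pattern this answer describes can be sketched in Ruby with plain threads: start each transfer, keep a handle to it, and either block on completion or poll. The `fetch_file` method below is hypothetical, and the transfer itself is simulated with a `sleep`; a real app would use an HTTP client or a job library here.

```ruby
# Start an asynchronous "download" and return a handle (a Thread).
# The sleep stands in for the actual network transfer.
def fetch_file(name)
  Thread.new do
    sleep(0.1)
    "#{name}: done" # the thread's value once it finishes
  end
end

# Start several downloads and keep the handles.
handles = %w[a.jpg b.jpg].map { |n| fetch_file(n) }

# Option 1: block until each one completes and collect results.
results = handles.map(&:value)

# Option 2 (non-blocking): poll periodically instead, e.g.
#   handles.reject(&:alive?)  # the ones that have finished

puts results
```

In a web app the handle would usually be a job ID stored in the database rather than an in-process thread, but the start-then-poll (or start-then-callback) shape is the same.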


This is an old question, but I worried about the same problem when handling large files, thinking that the app processes were blocked while a file was being transferred. It turns out that, if I understood correctly, nginx (and probably other servers too) buffers the file contents during the transfer, so the Rails processes are not blocked; Rails only gets involved once the transfer is complete, for example to resize images or do similar post-processing.
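For illustration, here is a minimal nginx fragment touching the buffering behavior this answer describes. The directive names are real nginx directives; the values and the `rails_app` upstream name are example assumptions, not a recommended production configuration.

```nginx
server {
    client_max_body_size     100m;  # allow large request bodies (uploads)
    client_body_buffer_size  1m;    # small bodies buffered in memory, larger spill to disk

    location / {
        # Buffer the full request body before handing it to the
        # upstream (on by default), so a Rails worker is not tied
        # up for the duration of a slow client transfer.
        proxy_request_buffering on;
        proxy_pass http://rails_app;
    }
}
```

Because of this buffering, the app worker only spends time on the request once the body has fully arrived at nginx, which is why the Rails side appears unblocked during the transfer itself.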

