Best practice for handling large amounts of data while a user waits (in Rails)?

I have a bookmarklet that, when used, sends all the URLs on the current browser page to a Rails 3 application for processing. Behind the scenes, I use Typhoeus to verify that each URL returns a 2XX status code. I currently kick this off with an AJAX request to the Rails server and simply wait for it to finish and return the results. For a small set this is very fast, but when the number of URLs is large, the user can be left waiting 10-15 seconds.
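The Typhoeus part might look like the sketch below: queue every URL on a hydra and treat any 2XX as a live link. The `max_concurrency` value, the use of HEAD requests, and the 10-second timeout are assumptions for illustration, not details from the question.

```ruby
# Any 2XX response counts as a reachable URL.
def success_status?(code)
  (200..299).cover?(code)
end

# Check a batch of URLs concurrently; returns { url => true/false }.
def check_urls(urls, max_concurrency: 20)
  require "typhoeus"
  hydra   = Typhoeus::Hydra.new(max_concurrency: max_concurrency)
  results = {}
  urls.each do |url|
    request = Typhoeus::Request.new(url, method: :head,
                                    followlocation: true, timeout: 10)
    request.on_complete { |response| results[url] = success_status?(response.code) }
    hydra.queue(request)
  end
  hydra.run  # blocks until every queued request completes
  results
end
```

With a single `hydra.run` over the whole batch, the total wall-clock time is bounded by the slowest URL rather than the sum of all of them, which is why small batches return quickly.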

I looked at using Delayed Job to handle this outside of the request thread, but that doesn't seem like the right use case. Since the user has to wait until processing completes to see the results, and a Delayed Job worker may take up to five seconds before picking up the job, I can't guarantee that processing starts as soon as possible. That extra latency is unacceptable here.

Ideally, this is what would happen:

  • User clicks the bookmarklet
  • Data is sent to the server for processing
  • A waiting page is returned instantly while a separate thread (or worker) does the processing
  • The waiting page periodically polls the server via AJAX for status and updates itself (e.g. "4 of 567 URLs processed...")
  • The waiting page is updated with the results once they are ready

A few constraints:

  • The app is hosted on Heroku (which enforces a 30-second request timeout)

Is this the right approach? If so, should I implement it with Delayed Job, or with something else (that works on Heroku)? Thanks.

1 Answer

Yes, that workflow is the right approach. Run the url-check in a background job, and store the result for each URL as it is processed (for example, in the database, tied to the batch it belongs to). Return the waiting page immediately, and have it poll via AJAX for how many URLs have been processed so far, updating the page until everything is done and then rendering the results. Any background worker that runs on Heroku will do for this.
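A background job along those lines might look like the sketch below, written in the `Struct`-based style Delayed Job jobs commonly use. The injectable `checker` and the per-URL result hashes are assumptions for illustration (in Rails you would persist each result, as the answer suggests, instead of returning an array).

```ruby
# A queue-friendly job that records each URL's result as it goes,
# so a polling waiting page sees live progress.
class UrlCheckJob < Struct.new(:urls, :checker)
  def perform
    urls.each_with_index.map do |url, i|
      ok = checker.call(url)  # e.g. a Typhoeus 2XX check
      # In Rails this would persist instead, e.g.:
      #   UrlResult.create!(url: url, ok: ok)  # hypothetical model
      { url: url, ok: ok, processed: i + 1 }
    end
  end
end
```

With Delayed Job this would be enqueued as `Delayed::Job.enqueue UrlCheckJob.new(urls, checker)`; the waiting page then polls the stored `processed` count rather than holding the web request open past Heroku's 30-second limit.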

