Is there a way to tell R or the RCurl package to give up trying to load a web page if it takes longer than a specified period of time, and move on to the next line of code? For instance:
> library(RCurl)
> u = "http://photos.prnewswire.com/prnh/20110713/NY34814-b"
> getURL(u, followLocation = TRUE)
> print("next line")
This will just hang on my system and not go to the last line.
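For completeness: once getURL() can be made to fail rather than hang (e.g. via the timeout curl option discussed below), base R's tryCatch() is enough to catch the error and carry on. A minimal sketch, where the 5-second timeout is an arbitrary illustrative value:

library(RCurl)

u <- "http://photos.prnewswire.com/prnh/20110713/NY34814-b"

page <- tryCatch(
  getURL(u, followLocation = TRUE, .opts = list(timeout = 5)),
  error = function(e) {
    message("getURL failed: ", conditionMessage(e))
    NA_character_        # placeholder so 'page' always exists
  }
)
print("next line")       # reached whether or not the download worked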
EDIT: Based on @Richie Cotton's answer below, I can "sort of" achieve what I want, but I don't understand why it takes longer than expected. For example, if I do the following, the system freezes until I toggle the "Misc → Buffered Output" option in RGui:
> system.time(getURL(u, followLocation = TRUE, .opts = list(timeout = 1)))
Error in curlPerform(curl = curl, .opts = opts, .encoding = .encoding) :
  Operation timed out after 1000 milliseconds with 0 out of 0 bytes received
Timing stopped at: 0.02 0.08 ***6.76***
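As an aside, the error itself can be absorbed inside the timed expression, so that system.time() returns a normal timing object instead of stopping early. A sketch (this does not explain the RGui freeze, it just keeps the error from propagating):

system.time(
  tryCatch(getURL(u, followLocation = TRUE, .opts = list(timeout = 1)),
           error = function(e) NULL)   # swallow the timeout error
)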
SOLUTION: Based on @Duncan's post below, and after looking at the curl docs, I found a solution using the maxredirs option as follows:
> getURL(u, followLocation = TRUE, .opts = list(timeout = 1, maxredirs = 2, verbose = TRUE))
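Putting the pieces together, a sketch of the full pattern so that the script always reaches the next statement (the option values are illustrative):

page <- tryCatch(
  getURL(u, followLocation = TRUE,
         .opts = list(timeout = 1, maxredirs = 2)),
  error = function(e) NA_character_   # give up and carry on
)
print("next line")

Since capping maxredirs is what resolves it, the page presumably redirects in a loop; limiting redirects makes libcurl abandon it quickly, while timeout bounds the worst case.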
Thank you,
Tony Breyal
O/S: Windows 7
R version 2.13.0 (2011-04-13)
Platform: x86_64-pc-mingw32/x64 (64-bit)

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
[1] RCurl_1.6-4.1  bitops_1.0-4.1

loaded via a namespace (and not attached):
[1] tools_2.13.0