Download files from a secure FTP (SFTP) server using R

I need to download files from an SFTP server, parse them, and insert their content into a database.

I am currently using RCurl as follows:

    library(RCurl)
    url <- "sftp://data.ftp.net/incoming.data.txt"
    x <- getURL(url, userpwd = "<id>:<passwd>")
    writeLines(x, "incoming.data.txt")

I also looked at download.file and I do not see SFTP support there. Has anyone else done a similar job? Since I will be receiving multiple files, I have noticed that RCurl sometimes times out. I would like to download all the files from the SFTP server first and then process them. Any ideas?
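For concreteness, here is a rough sketch of the download-everything-first approach I have in mind (the host name, credentials, and file names are placeholders, not my real setup):

    library(RCurl)

    base.url <- "sftp://data.ftp.net/"   # placeholder server
    userpwd  <- "<id>:<passwd>"

    # List the remote directory; RCurl returns the listing as a single string.
    listing <- getURL(base.url, userpwd = userpwd, dirlistonly = TRUE)
    files   <- Filter(nzchar, strsplit(listing, "\r?\n")[[1]])

    # First pass: download every file to disk.
    for (f in files) {
      x <- getURL(paste0(base.url, f), userpwd = userpwd)
      writeLines(x, f)
    }
    # Second pass: parse the local copies and load them into the database.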

1 answer

It seems the question is really: "How do I avoid timeouts in RCurl?"

Increase the value of CURLOPT_CONNECTTIMEOUT. This is really the same problem as setting the cURL timeout in PHP.

Edit, from the comments below:

    x <- getURL(url, userpwd = "<id>:<passwd>", connecttimeout = 60)  # 60 seconds, e.g.
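Note that connecttimeout only covers establishing the connection. If the transfers themselves are slow (e.g. large files), the overall transfer timeout may be what is expiring; a sketch, using the same url and credentials as above:

    # Raise both the connection timeout and the whole-transfer timeout
    # (curl's CURLOPT_CONNECTTIMEOUT and CURLOPT_TIMEOUT).
    x <- getURL(url, userpwd = "<id>:<passwd>",
                connecttimeout = 60,   # seconds allowed to connect
                timeout        = 300)  # seconds allowed for the full transfer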
