Creating a new table from a dplyr object without loading it into R memory

I use dplyr to manipulate large tables in PostgreSQL. After several manipulations, I end up with a large lazy query (a view, in effect) that I want to save as a new table. A crude solution is to pull it into R memory and then write it back to the database. That seems wasteful, since the new table is really just a CREATE TABLE plus the SQL that dplyr has already generated. Is there any way to issue the CREATE (or an UPDATE) directly against a dplyr database object, without the round trip through R?
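For context, the round trip I want to avoid looks something like this (connection details and table names are illustrative, not from my real setup):

```r
library(dplyr)

# Illustrative connection; details are placeholders.
con <- src_postgres(dbname = "mydb", host = "db.example.com",
                    port = 5432, user = "me", password = "secret")

# A lazy query: nothing is executed yet, dplyr just builds SQL.
bigView <- tbl(con, "big_table") %>%
  filter(value > 0)

# The wasteful part: pull every row into R memory...
local_copy <- collect(bigView)

# ...only to write it all straight back into the database.
copy_to(con, local_copy, "result_table")
```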

1 answer

Hoping to get a few points by translating Hadley's useful comment into an answer. Also, I think there have been some changes since then that allow copy_to to do this (at least it seems to have worked for me).

data(iris)
remoteDb <- src_postgres(dbname = "irisDb", host = "remoteDB.somewhere.com",
                         port = 5432, user = "yourUser", password = "yourPass")

irisSql <- copy_to(remoteDb, iris, "iris_table", temporary = FALSE)

irisSqlPermanent <- compute(irisSql, name = "iris_table_permanent", temporary = FALSE)

The first two lines load the standard "iris" R dataset and establish a database connection (in this case, to Postgres).

The copy_to line uses what appears to be an undocumented argument, temporary, which allows the data frame to be stored permanently in the database (I found it in a bug report). The compute line also works as intended, though I'm not sure whether you still need temporary=FALSE on copy_to when you follow it with compute.
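To address the original question more directly: compute() works on any lazy tbl, not just one created by copy_to, so you can persist the result of a chain of manipulations entirely server-side. A sketch, assuming the connection and iris_table from above (the filter, the new column, and the table name iris_big are illustrative):

```r
# Build a lazy query on the remote table; no rows are pulled into R.
irisBig <- tbl(remoteDb, "iris_table") %>%
  filter(Petal.Length > 2) %>%
  mutate(petal_area = Petal.Length * Petal.Width)

# Materialise the query as a permanent table, inside the database.
irisBigPermanent <- compute(irisBig, name = "iris_big", temporary = FALSE)

# The new table is now directly accessible on the connection.
tbl(remoteDb, "iris_big")
```

The key point is that compute() runs the generated SQL on the server and stores the result there; only the table reference, not the data, comes back to R.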
