How to write 1-million-row Excel files with the openxlsx package in R

I have users who cannot, or do not want to, connect to relational databases and prefer to work with data exported to Excel files. The record sets exported from this database can become quite large. (I also export to CSV files.)

My question is a follow-up to this one: java.lang.OutOfMemoryError when writing to Excel from R.

As recommended in the accepted answer to that question (or rather in its first comment), I now use the Rcpp-based openxlsx package to export some views from the database. It works when the export has ~67,000 rows, but it fails for larger data sets (~1 million rows, ~20 columns, all numeric except for a few dates).

openxlsx::write.xlsx(data, file = "data.2008-2016.xlsx") # 800000 rows

Error: zipping up workbook failed. Please make sure Rtools is installed or a zip application is available to R.
         Try installr::install.rtools() on Windows

(I am using a Linux machine, and /usr/bin/zip is available to R.)
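Since the error points at the zip step, one thing worth confirming is that R can actually locate a zip binary. A minimal check, assuming openxlsx shells out to the program named by the conventional `R_ZIPCMD` environment variable (the same one `utils::zip()` consults by default):

```r
# Sys.which() returns "" if no zip binary is on the PATH visible to R
Sys.which("zip")

# Point R_ZIPCMD explicitly at the system binary in case it is unset
Sys.setenv(R_ZIPCMD = "/usr/bin/zip")
Sys.getenv("R_ZIPCMD")
```

If `Sys.which("zip")` comes back empty inside the failing session (e.g. under a cron job or a restricted PATH), that alone would explain the zipping error.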

Can I give the openxlsx package more memory, or set some configuration options so that it works better with large data sets?
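Worth noting when exploring options: the xlsx format itself caps a worksheet at 1,048,576 rows, so a ~1-million-row export is close to the limit of a single sheet. One workaround I have seen suggested is splitting the export across several sheets with openxlsx's documented `createWorkbook()`/`addWorksheet()`/`writeData()`/`saveWorkbook()` API. A sketch (the chunk size is a placeholder, and `data` stands for the query result):

```r
library(openxlsx)

rows_per_sheet <- 500000          # placeholder; keep well under 1,048,576
n  <- nrow(data)                  # 'data' is the view exported from the db
wb <- createWorkbook()

# Write the data in slices, one worksheet per slice
for (i in seq_len(ceiling(n / rows_per_sheet))) {
  first <- (i - 1) * rows_per_sheet + 1
  last  <- min(i * rows_per_sheet, n)
  sheet <- paste0("part", i)
  addWorksheet(wb, sheet)
  writeData(wb, sheet, data[first:last, ])
}

saveWorkbook(wb, "data.2008-2016.xlsx", overwrite = TRUE)
```

This does not shrink the total workbook, but it avoids a single oversized sheet and may keep individual zip entries smaller; whether it helps with the failure depends on openxlsx internals.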

For openxlsx, is there something like the options(java.parameters = "-Xmx1000m") setting used by the Java-based xlsx package?
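(For context: openxlsx is C++/Rcpp-based, so there is no JVM heap to enlarge; `-Xmx` has no equivalent here.) Since the question already mentions exporting CSV files as well, a fast CSV fallback for data sets this size can be sketched with `data.table::fwrite()`, which handles millions of rows comfortably:

```r
library(data.table)

# 'data' stands for the query result exported from the db;
# fwrite() is multithreaded and far faster than write.csv()
fwrite(data, "data.2008-2016.csv")
```

Users who insist on Excel can still open the CSV there, at the cost of losing typed columns and formatting.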


Update: working in RStudio with the data loaded from the db, write.xlsx() did eventually succeed; the 800,000 rows produced a 93 MB xlsx file.

