I'm having trouble trying to use large objects in R. For example:
> memory.limit(4000)
> a = matrix(NA, 1500000, 60)
> a = matrix(NA, 2500000, 60)
> a = matrix(NA, 3500000, 60)
Error: cannot allocate vector of size 801.1 Mb
> a = matrix(NA, 2500000, 60)
Error: cannot allocate vector of size 572.2 Mb
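As a side note, the reported sizes are consistent with a logical matrix: `matrix(NA, ...)` creates a logical matrix, and a logical cell occupies 4 bytes in R. A quick sanity check (my own arithmetic, not from the original post):

```r
# matrix(NA, nrow, 60) is logical, and each logical cell takes 4 bytes,
# so the requested allocations match the sizes in the error messages.
rows <- c(3500000, 2500000)
mb <- rows * 60 * 4 / 1024^2  # size in Mb of a logical matrix with 60 columns
round(mb, 1)                  # 801.1 and 572.2, as reported
```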
I understand that this is due to the difficulty of obtaining contiguous blocks of memory (from here):
Error messages beginning "cannot allocate vector of size" indicate a failure to obtain memory, either because the size exceeded the address-space limit for the process or, more likely, because the system was unable to provide the memory. Note that on a 32-bit build there may well be enough free memory available, but not a large enough contiguous block of address space into which it can be mapped.
How can I get around this? My main difficulty is that I get to a certain point in my script and R cannot allocate 200-300 MB for an object... I cannot preallocate the block because I need the memory for other processing. This happens even when I delete unneeded objects.
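For reference, the housekeeping I have tried follows the usual pattern: drop large intermediates with `rm()`, force a collection with `gc()`, and check sizes with `object.size()`. These are all standard base-R calls, though on a 32-bit build there is no guarantee that `gc()` leaves behind a contiguous block large enough for the next allocation; processing in smaller chunks avoids needing one. A sketch (the chunked loop is illustrative, not my actual script):

```r
# Free large intermediates explicitly and trigger garbage collection;
# gc() also prints a summary of current memory usage.
big <- matrix(0, 1e6, 10)
print(object.size(big), units = "Mb")  # how much this object occupies
rm(big)
gc()

# Working in column-sized chunks keeps each allocation small, so R
# never needs a single huge contiguous block.
chunk_means <- numeric(60)
for (j in seq_len(60)) {
  col <- rnorm(1e5)              # stand-in for loading one column at a time
  chunk_means[j] <- mean(col)
  rm(col)
}
```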
Edit: Windows XP SP3, 4 GB RAM, R 2.12.0:
> sessionInfo()
R version 2.12.0 (2010-10-15)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_Caribbean.1252  LC_CTYPE=English_Caribbean.1252
[3] LC_MONETARY=English_Caribbean.1252 LC_NUMERIC=C
[5] LC_TIME=English_Caribbean.1252

attached base packages:
[1] stats graphics grDevices utils datasets methods base
memory-management vector matrix r
Benjamin Mar 02 '11 at 18:13