I recently discovered the wonders of bigmemory, ff and filehash for handling very large matrices.
How can I handle very large (300 MB+) lists? In my work I deal with these lists all day, every day. I can patch over the problem with save() and load() hacks, but I would prefer a bigmemory-like solution. Something like bigmemory's big.matrix would be ideal, where I work with the list essentially the same way as a normal list, except that it occupies only something like 660 bytes of my RAM.
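For context, the save()/load() hack I mean is roughly the following (a minimal, self-contained sketch; the object and file names are just for illustration):

    ## Fit a batch of models, park them on disk, and free the RAM
    Y <- rnorm(1000); X <- rnorm(1000)
    A <- lapply(1:6000, function(i) lm(Y ~ X))   # several hundred MB in RAM

    save(A, file = "A.RData")   # serialise the whole list to disk
    rm(A); gc()                 # reclaim the memory

    ## Later, when A is needed again, the *entire* list must be read back at once
    load("A.RData")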
These lists are basically lists of length >1000 of lm() objects (or similar regression objects). For example,
Y <- rnorm(1000) ; X <- rnorm(1000)
A <- lapply(1:6000, function(i) lm(Y~X))
B <- lapply(1:6000, function(i) lm(Y~X))
C <- lapply(1:6000, function(i) lm(Y~X))
D <- lapply(1:6000, function(i) lm(Y~X))
E <- lapply(1:6000, function(i) lm(Y~X))
F <- lapply(1:6000, function(i) lm(Y~X))
In my project I will have objects like A, B, C, D, E, F (and even more than that) that I have to work with interactively.
If these were giant matrices, there would be a ton of support. I was wondering whether any package offers similar support for large list objects.
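To make the question concrete, the kind of interface I am hoping for is something like what filehash already provides for key-value storage, sketched below. This is only an assumption about how it might be used here (the database name and keys are placeholders, and I have not benchmarked it on lm() objects):

    library(filehash)

    dbCreate("models_db")        # one-off: create the on-disk database
    db <- dbInit("models_db")    # lightweight handle, negligible RAM

    ## Store each fitted model under its own key instead of holding the list in RAM
    Y <- rnorm(1000); X <- rnorm(1000)
    for (i in 1:6000) {
      dbInsert(db, paste0("A_", i), lm(Y ~ X))
    }

    ## Pull back only the objects actually needed
    fit_42 <- dbFetch(db, "A_42")
    summary(fit_42)

Something with this kind of on-disk, fetch-on-demand behaviour, but as transparent as big.matrix is for matrices, is what I am after.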
Jase