I am working on a task in which a temporary hash table is reused inside a loop. The hash table is represented by an environment in R. The problem is that as the loop continues, memory usage keeps rising no matter how I delete the table (I tried rm() and gc(), but neither frees the memory). As a result, I cannot run an unusually long loop, say 10M iterations. This looks like a memory leak, but I cannot find a solution elsewhere. What is the right way to completely remove an environment and, at the same time, free all the memory it previously occupied? Thanks in advance for helping me verify the problem.
Here is a very simple example. I am using Windows 8 and R version 3.1.0.
> fun = function(){
+   H = new.env()
+   for(i in rnorm(100000)){
+     H[[as.character(i)]] = rnorm(100)
+   }
+   rm(list=names(H), envir=H, inherits=FALSE)
+   rm(H)
+   gc()
+ }
>
> for(k in 1:5){
+   print(k)
+   fun()
+   gc()
+   print(memory.size(F))
+ }
[1] 1
[1] 40.43
[1] 2
[1] 65.34
[1] 3
[1] 82.56
[1] 4
[1] 100.22
[1] 5
[1] 120.36
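For anyone trying to reproduce this, here is a minimal diagnostic sketch (my own variant, not the original code) that may help narrow down where the growth comes from: it reuses the same fixed set of character keys on every call instead of generating new keys from rnorm(). The function name fun_fixed_keys is hypothetical, and memory.size() is Windows-only. The assumption being tested is that the growth is tied to the ever-new key strings rather than to the environment object itself; if memory stabilizes after the first iteration here, that would support it.

# Diagnostic sketch: same experiment, but with a fixed key set reused
# on every call, so no new key strings are created after the first run.
fun_fixed_keys = function(){
  H = new.env()
  keys = as.character(1:100000)   # identical key strings every call
  for(k in keys){
    H[[k]] = rnorm(100)
  }
  rm(list = ls(H), envir = H)     # clear the table's contents
  rm(H)                           # drop the environment itself
  gc()
}

for(k in 1:5){
  fun_fixed_keys()
  gc()
  print(memory.size(FALSE))       # Windows-only memory report
}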