I am coding a machine learning procedure that works with large datasets, plus some related computations. Since the datasets can be very large, some calculations produce very large matrices (for example, a 29,000 x 29,000 Array{Float64,2}), which require large amounts of memory (RAM). Later in the procedure, some of these objects (for example, the original dataset) are no longer needed, but they still occupy memory.
Is there a way to "free" variables at some point? Or, alternatively, is there a way to use part of the hard drive, something like swap space?
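To illustrate what I am after, here is a minimal sketch of the two approaches I am aware of (assuming Julia has no explicit `free`): rebinding the variable to `nothing` so the garbage collector can reclaim the memory, and memory-mapping a disk-backed array via the standard-library `Mmap` module for data larger than RAM. The file name `bigmatrix.bin` and the small dimensions are just placeholders for illustration.

```julia
using Mmap

# A large matrix (kept small here for illustration).
A = rand(1000, 1000)

# When A is no longer needed, drop the reference; Julia has no explicit
# free, so rebinding to `nothing` lets the garbage collector reclaim it.
A = nothing
GC.gc()   # suggest a collection (Julia >= 0.7; older versions used gc())

# For data larger than RAM, a memory-mapped array keeps the data on disk
# and the OS pages it in and out as needed:
io = open("bigmatrix.bin", "w+")
B = Mmap.mmap(io, Matrix{Float64}, (1000, 1000))
B .= 0.0          # writes go to the disk-backed file
Mmap.sync!(B)     # flush pending changes to disk
close(io)
```

Is rebinding to `nothing` plus `GC.gc()` the recommended pattern, or is there a better way?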
julia-lang
dbautistav