I am using numpy and trying to create a huge matrix. In doing so, I get a MemoryError.
Since the contents of the matrix are not important, I will just show how easy it is to reproduce the error:
    import numpy as np

    a = 10000000000
    data = np.array([float('nan')] * a)   # the ten-billion-element Python list alone needs tens of GB
No wonder it throws a MemoryError.
There are two things I would like to say:
- I really need to create and use a large matrix
- I think I have enough RAM to process this matrix (I have 24 GB of RAM; see the quick size check below).
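For reference, a rough back-of-the-envelope check (plain arithmetic, assuming numpy's default float64 dtype of 8 bytes per element):

    n = 10000000000            # number of elements in the array above
    print(n * 8 / 1024**3)     # ~74.5 GiB needed, versus 24 GB of RAM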
Is there an easy way to handle large matrices in numpy?
Just to be safe, I previously read these posts (which sound similar):
Very large matrices using Python and NumPy
Python / Numpy MemoryError
Handling a very large dataset in python - memory error
P.S. I did the arithmetic: ten billion float64 values come to roughly 75 GB, so there is no way this matrix fits in memory. My 24 GB of RAM is nowhere near enough, and I will have to process the data some other way, probably out of core, perhaps with pytables.
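To make the out-of-core idea concrete, here is a minimal sketch of the kind of approach I have in mind, using np.memmap instead of pytables just for illustration (the file name and the smaller sizes are placeholders, not my real data):

    import numpy as np

    # Back the array with a file on disk instead of RAM; only the pages
    # that are actually touched get loaded into memory.
    n = 1000000000                      # placeholder size (~8 GB on disk)
    data = np.memmap('scratch.dat', dtype='float64', mode='w+', shape=(n,))

    # Fill it in chunks so only a small window is resident at any time.
    chunk = 10000000
    for start in range(0, n, chunk):
        data[start:start + chunk] = np.nan
    data.flush()

This is obviously not the same as having the whole matrix in RAM, but it is the kind of pattern I am hoping numpy (or pytables) supports cleanly.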