Does applying SVD instantly cause a MemoryError?

I am trying to apply SVD to my matrix (3241 x 12596), which was obtained after some text processing (with the ultimate goal of performing latent semantic analysis), and I cannot understand why this happens, since my 64-bit machine has 16 GB of RAM. The moment svd(self.A) is called, it raises an error. The exact error is given below:

Traceback (most recent call last):
  File ".\SVD.py", line 985, in <module>
    _svd.calc()
  File ".\SVD.py", line 534, in calc
    self.U, self.S, self.Vt = svd(self.A)
  File "C:\Python26\lib\site-packages\scipy\linalg\decomp_svd.py", line 81, in svd
    overwrite_a = overwrite_a)
MemoryError

So I tried to use

self.U, self.S, self.Vt = svd(self.A, full_matrices=False)

and this time it produces the following error:

Traceback (most recent call last):
  File ".\SVD.py", line 985, in <module>
    _svd.calc()
  File ".\SVD.py", line 534, in calc
    self.U, self.S, self.Vt = svd(self.A, full_matrices= False)
  File "C:\Python26\lib\site-packages\scipy\linalg\decomp_svd.py", line 71, in svd
    return numpy.linalg.svd(a, full_matrices=0, compute_uv=compute_uv)
  File "C:\Python26\lib\site-packages\numpy\linalg\linalg.py", line 1317, in svd
    work = zeros((lwork,), t)
MemoryError

Is this matrix simply too large for NumPy to handle, and is there anything I can do at this point without changing the methodology itself?

+5

As @Ferdinand Beyer suggested, check whether you are running a 32-bit build of Python on your 64-bit machine: a 32-bit process can only address around 2 GB of memory, regardless of how much RAM is installed.

Switching to a 64-bit build of Python should solve the problem.
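A quick way to verify which build is actually running (a convenience check, not part of the original answer):

import struct
import sys

print(sys.version)               # the banner usually names the build
print(struct.calcsize("P") * 8)  # pointer size in bits: 32 or 64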

+2

First, make sure you pass full_matrices=False to scipy.linalg.svd: since your matrix has far fewer rows than columns (only 3,241 rows), computing the full decomposition means building a 12,596 x 12,596 matrix for V!
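Rough numbers, assuming 8-byte float64 entries (my arithmetic, not from the original answer):

full_V    = 12596 * 12596 * 8 / 1024.0 ** 3   # ~1.18 GB for the full V alone
reduced_V =  3241 * 12596 * 8 / 1024.0 ** 3   # ~0.30 GB with full_matrices=False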

But even then, it can still run out of memory: scipy.linalg.svd computes a dense SVD, so it a) needs the whole matrix in RAM as a dense array, and b) allocates a sizeable LAPACK workspace on top of the inputs and outputs. Your second traceback shows the failure happening exactly while allocating that workspace (the work array of length lwork).

If you only need the top K singular vectors, try sparseSVD from PyPI, or scipy.sparse.linalg.svds in newer versions of scipy. Both compute only the leading part of the decomposition and work directly on scipy.sparse matrices, so the term-document matrix never has to be densified.
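A minimal sketch of the scipy.sparse.linalg.svds route (the density and k=300 here are illustrative choices of mine, not from the answer):

import scipy.sparse
from scipy.sparse.linalg import svds

# stand-in for a sparse 3241 x 12596 term-document matrix
A = scipy.sparse.rand(3241, 12596, density=0.01, format='csc')

# compute only the 300 largest singular triplets
U, S, Vt = svds(A, k=300)
# note: unlike scipy.linalg.svd, the ordering of S is not
# guaranteed to be descending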

And if all of that fails, and your real goal is LSA over large corpora, take a look at gensim.
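For context, a minimal sketch of LSA with gensim (the toy documents, variable names, and num_topics value are illustrative assumptions, not from the answer):

from gensim import corpora, models

texts = [["human", "machine", "interface"],
         ["graph", "trees", "minors"]]           # toy tokenized documents

dictionary = corpora.Dictionary(texts)           # term <-> id mapping
corpus = [dictionary.doc2bow(t) for t in texts]  # sparse bag-of-words vectors

# LsiModel streams over the corpus, so the full term-document matrix
# is never materialized in RAM; a real LSA run would use a few hundred topics
lsi = models.LsiModel(corpus, id2word=dictionary, num_topics=2)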

+8
