I am trying to estimate the average value of log(det(AA^T) + 1) in Python. My simple code works fine until the matrices reach 17 × 17, at which point it gives me a math error. Here is the code:
import math
import numpy as np

iter = 10000
for n in xrange(1, 20):
    h = n
    dets = []
    for _ in xrange(iter):
        # random h x n matrix with entries -1 or +1
        A = np.random.randint(2, size=(h, n)) * 2 - 1
        # determinant of the Gram matrix A A^T
        detA_Atranspose = np.linalg.det(np.dot(A, A.transpose()))
        try:
            # base-2 log of (det + 1)
            logdetA_Atranspose = math.log(detA_Atranspose + 1, 2)
        except ValueError:
            print "Ooops!", n, detA_Atranspose
        dets.append(logdetA_Atranspose)
    print np.mean(dets)
A is a matrix whose entries are either -1 or 1. Since AA^T is a Gram matrix, its determinant should be nonnegative, so det(AA^T) + 1 should always be at least 1 and the log should be defined.
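For reference, here is the quantity I am computing for a single matrix, as a minimal standalone sketch (the helper name is just for illustration):

import math
import numpy as np

def log2_det_gram_plus_one(A):
    # determinant of the Gram matrix A A^T, then log base 2 of (det + 1)
    G = np.dot(A, A.transpose())
    return math.log(np.linalg.det(G) + 1, 2)

# small 2 x 2 example with +/-1 entries: A A^T = [[2, 0], [0, 2]], det = 4
A = np.array([[1, -1],
              [-1, -1]])
print log2_det_gram_plus_one(A)   # log2(5) ~ 2.32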
What am I doing wrong and how can I fix this? What is special about 17?