I am interested in calculating the derivative of the matrix determinant using TensorFlow. From experiments, I see that TensorFlow has not implemented a gradient for the determinant op:
LookupError: No gradient defined for operation 'MatrixDeterminant' (op type: MatrixDeterminant)
Further research showed that it is actually possible to calculate this derivative; see, for example, the Jacobi formula. It seems that to implement this gradient for the determinant I need to use the function decorator,
@tf.RegisterGradient("MatrixDeterminant")
def _sub_grad(op, grad):
    ...
However, I am not familiar enough with TensorFlow to understand how this can be done. Does anyone have any ideas about this issue?
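As a sanity check before wiring anything into TensorFlow, the identity the custom gradient would implement can be verified numerically. The sketch below is my own check (it does not use TensorFlow at all): it compares Jacobi's formula, d det(A)/dA = det(A) · inv(A)ᵀ, against central finite differences on a random well-conditioned matrix.

```python
import numpy as np

# Numerical check of Jacobi's formula: d det(A)/dA = det(A) * inv(A).T
# This is the identity a custom MatrixDeterminant gradient would implement.
rng = np.random.RandomState(0)
A = rng.rand(3, 3) + 3 * np.eye(3)  # well-conditioned test matrix

analytic = np.linalg.det(A) * np.linalg.inv(A).T

eps = 1e-6
numeric = np.zeros_like(A)
for i in range(3):
    for j in range(3):
        dA = np.zeros_like(A)
        dA[i, j] = eps
        # central finite difference in the (i, j) entry
        numeric[i, j] = (np.linalg.det(A + dA) - np.linalg.det(A - dA)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-5))  # → True
```

If this holds, the body of the registered gradient function only has to return `grad * det * transpose(inverse(A))`.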
Here is an example where I ran into this problem:
x = tf.Variable(tf.ones(shape=[1]))
y = tf.Variable(tf.ones(shape=[1]))
A = tf.reshape(
    tf.pack([tf.sin(x), tf.zeros([1, ]), tf.zeros([1, ]), tf.cos(y)]),
    (2, 2)
)
loss = tf.square(tf.matrix_determinant(A))
optimizer = tf.train.GradientDescentOptimizer(0.001)
train = optimizer.minimize(loss)

init = tf.initialize_all_variables()
sess = tf.Session()
sess.run(init)
for step in xrange(100):
    sess.run(train)
print sess.run(x)
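For reference, here is what the same optimization looks like with the gradient written out by hand in plain NumPy, assuming the Jacobi-formula gradient stands in for the missing TensorFlow one. In this example A is diagonal, so det(A) = sin(x)·cos(y) and the chain rule through the parametrization is simple; the variable names mirror the TensorFlow snippet above.

```python
import numpy as np

# Pure-NumPy sketch of the same gradient-descent loop, using Jacobi's
# formula d det(A)/dA = det(A) * inv(A).T in place of the missing
# MatrixDeterminant gradient. A is diagonal here, so inv(A) is well defined
# as long as sin(x) and cos(y) stay nonzero.
x, y = 1.0, 1.0
lr = 0.001
for step in range(100):
    A = np.array([[np.sin(x), 0.0],
                  [0.0, np.cos(y)]])
    det = np.linalg.det(A)
    dloss_ddet = 2.0 * det                    # loss = det**2
    ddet_dA = det * np.linalg.inv(A).T        # Jacobi's formula
    dA = dloss_ddet * ddet_dA                 # gradient of loss w.r.t. A
    # chain rule through the parametrization of A
    dx = dA[0, 0] * np.cos(x)                 # d A[0,0] / dx = cos(x)
    dy = dA[1, 1] * (-np.sin(y))              # d A[1,1] / dy = -sin(y)
    x -= lr * dx
    y -= lr * dy
print(x)
```

Starting from x = y = 1.0, the loss decreases slowly and x drifts just below its initial value, which is the behavior the TensorFlow loop above should show once the gradient is registered.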
user1936768