TensorFlow Cross-Entropy in the tutorial

I just went through the TensorFlow tutorial ( https://www.tensorflow.org/versions/r0.8/tutorials/mnist/pros/index.html#deep-mnist-for-experts ).

I have two questions:

  • Why does the tutorial use a cost function of the form y_ * log(y)? Shouldn't it be y_ * log(y) + (1 - y_) * log(1 - y)?

  • How does TensorFlow calculate the gradient for the cost function I use? Shouldn't we tell TensorFlow somewhere how to calculate the gradient?

Thanks!

1 answer
  • For a scalar label y_ = 1 or 0 you can use y_ * log(y) + (1 - y_) * log(1 - y), but when the label is one-hot encoded, y_ = [0 1] or [1 0], the tutorial uses the categorical form y_ * log(y) summed over the classes. For two classes the two forms compute the same value (see the first sketch below).

  • TensorFlow knows how to calculate the gradient itself; you never have to specify it.

[Figure: a TensorFlow computation graph]

Every operation node in the graph has a known local gradient, so TensorFlow can run backpropagation (reverse-mode automatic differentiation) through the whole graph to get the gradient of your cost function (see the second sketch below).
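A quick numeric check of the first point, using NumPy with made-up probabilities: for a two-class one-hot label, the summed categorical form and the binary form give the same number.

```python
import numpy as np

# Softmax output for one example over two classes (hypothetical numbers).
y = np.array([0.8, 0.2])
# One-hot label: the true class is class 0.
y_ = np.array([1.0, 0.0])

# Categorical cross-entropy used in the tutorial: -sum(y_ * log(y)).
categorical = -np.sum(y_ * np.log(y))

# Binary cross-entropy with scalar label t and scalar probability p = y[0].
t, p = y_[0], y[0]
binary = -(t * np.log(p) + (1 - t) * np.log(1 - p))

print(categorical, binary)  # both ~0.2231: identical, since y sums to 1
```

They agree because with two classes 1 - y[0] is exactly y[1], so the (1 - y_) * log(1 - y) term is already covered by the sum over classes.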
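To illustrate the second point, here is a minimal sketch in the graph-style API the tutorial uses (TF 0.x/1.x; in TF 2.x the same calls live under tf.compat.v1, and tf.GradientTape is the modern equivalent). The shapes and variable names follow the MNIST tutorial; note that no gradient formula appears anywhere.

```python
import tensorflow as tf

# Inputs and one-hot labels, as in the MNIST tutorial.
x  = tf.placeholder(tf.float32, [None, 784])
y_ = tf.placeholder(tf.float32, [None, 10])

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, W) + b)

cross_entropy = -tf.reduce_sum(y_ * tf.log(y))

# tf.gradients walks the graph backwards, combining each op's registered
# gradient function (backpropagation); nothing is specified by hand.
grads = tf.gradients(cross_entropy, [W, b])

# The optimizer does the same thing internally before applying the update.
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)
```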
