How to interpret an increase in both loss and accuracy

I train deep learning models (CNNs) with TensorFlow. Many times during an epoch I have observed that loss and accuracy both increase, or both decrease. My understanding was that the two are always inversely related. In what scenario would they increase or decrease simultaneously?

+13
deep-learning tensorflow
Dec 01 '16 at 12:35
1 answer

As training proceeds, the loss decreases, apart from some fluctuation introduced by mini-batch gradient descent and/or regularization techniques such as dropout (which injects random noise).

If the loss is decreasing, training is going well.

Accuracy (presumably validation accuracy), by contrast, measures how good your model's predictions are.

While the model is learning, accuracy increases. Once the model starts overfitting, accuracy stops increasing and may even begin to decrease.

If loss decreases and accuracy decreases, your model is overfitting.
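One practical way to apply this rule is to compare the two curves across epochs: training loss still falling while validation accuracy has stalled is the classic overfitting signature. A minimal sketch (the function name, the `patience` knob, and the sample curves are illustrative assumptions, not part of any TensorFlow API):

```python
def overfitting_epoch(train_loss, val_acc, patience=2):
    """Return the first epoch index at which validation accuracy has not
    improved for `patience` epochs while training loss kept falling,
    or None if that never happens. `patience` is an assumed knob."""
    best_acc, stale = val_acc[0], 0
    for i in range(1, len(val_acc)):
        if val_acc[i] > best_acc:
            best_acc, stale = val_acc[i], 0
        else:
            stale += 1
        # Training loss still dropping + stalled validation accuracy
        # = the overfitting signature described above.
        if stale >= patience and train_loss[i] < train_loss[i - patience]:
            return i
    return None

# Toy curves: training loss keeps falling, validation accuracy peaks at epoch 3.
train_loss = [0.9, 0.7, 0.5, 0.4, 0.3, 0.2]
val_acc = [0.60, 0.70, 0.75, 0.80, 0.78, 0.76]
print(overfitting_epoch(train_loss, val_acc))  # → 5
```

This is essentially what early-stopping callbacks automate: they watch a validation metric and halt training once it stops improving.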

If loss increases and accuracy increases too, it is because your regularization techniques are working well and you are fighting the overfitting problem. This holds only if the loss then starts to decrease while accuracy continues to grow. Otherwise, if the loss keeps growing, your model is diverging and you should look for the cause (usually a learning rate that is too high).
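The underlying reason loss and accuracy can move together is that they measure different things: accuracy only thresholds each prediction, while cross-entropy loss also penalizes confidence on wrong answers. A short sketch with hand-picked toy predictions (the probabilities below are invented for illustration) shows accuracy rising between two "epochs" even as the mean loss rises, because one mistake is made with very high confidence:

```python
import math

def cross_entropy(probs, labels):
    """Mean binary cross-entropy; probs are predicted P(y=1)."""
    return -sum(
        math.log(p) if y == 1 else math.log(1 - p)
        for p, y in zip(probs, labels)
    ) / len(labels)

def accuracy(probs, labels):
    """Fraction of predictions correct after thresholding at 0.5."""
    return sum((p > 0.5) == (y == 1) for p, y in zip(probs, labels)) / len(labels)

labels = [1, 1, 1, 0]

# "Epoch A": only 2 of 4 correct, but every prediction hovers near 0.5,
# so each -log term is moderate.
probs_a = [0.55, 0.45, 0.45, 0.45]

# "Epoch B": 3 of 4 correct, but the single mistake is made with 0.99
# confidence, so its -log(0.01) term dominates the mean loss.
probs_b = [0.9, 0.9, 0.9, 0.99]

print(accuracy(probs_a, labels), cross_entropy(probs_a, labels))  # 0.5, ~0.70
print(accuracy(probs_b, labels), cross_entropy(probs_b, labels))  # 0.75, ~1.23
```

So an epoch can raise accuracy (more samples cross the 0.5 threshold correctly) while also raising loss (the remaining errors become more confident), which is exactly the regime described above.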

+34
Dec 01 '16 at 13:13
