As training progresses the loss decreases, apart from some fluctuation introduced by mini-batch gradient descent and/or regularization techniques such as dropout (which injects random noise).
If the loss is decreasing, the training process is going well.
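To judge the trend rather than the batch-to-batch noise, one common trick is to smooth the per-batch loss with a moving average before reading anything into it. A minimal sketch in plain NumPy (the `smoothed` helper and the window size are my own illustrative choices, not part of any specific library):

```python
import numpy as np

def smoothed(batch_losses, window=50):
    # Moving average over recent mini-batch losses: the downward trend
    # is what matters, not the noise injected by mini-batch sampling
    # and dropout. `window` is an arbitrary choice for illustration.
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(batch_losses, dtype=float), kernel, mode="valid")
```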
Accuracy (on the validation set, presumably), by contrast, is a measure of how good your model's predictions are.
While the model is learning, accuracy increases. If the model starts to overfit, accuracy stops increasing and may even begin to decrease.
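For concreteness, accuracy is just the fraction of correct predictions, computed on held-out data. A minimal sketch (the `accuracy` helper is hypothetical; any framework's built-in metric does the same thing):

```python
import numpy as np

def accuracy(y_true, y_pred):
    # Fraction of predictions that match the labels. Compute it on a
    # validation set, not the training set, to spot overfitting.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(y_true == y_pred))
```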
If the loss decreases and the accuracy decreases, your model is overfitting.
If the loss increases and the accuracy also increases, it is because your regularization techniques are working well and you are fighting the overfitting problem. This holds only if the loss then starts to decrease while the accuracy keeps increasing. Otherwise, if the loss keeps growing, your model is diverging and you should look for the cause (usually the learning rate is too high).
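Putting the rules above together, here is a rough sketch of the decision table (the `diagnose` helper is my own illustration, assuming you log training loss and validation accuracy once per epoch):

```python
def diagnose(losses, accuracies):
    # Compare the start and end of the logged curves; a real check
    # would smooth them first (see the moving average above).
    loss_up = losses[-1] > losses[0]
    acc_up = accuracies[-1] > accuracies[0]
    if not loss_up and acc_up:
        return "learning: loss falls, accuracy rises"
    if not loss_up and not acc_up:
        return "overfitting: loss falls, accuracy drops"
    if loss_up and acc_up:
        return ("regularization fighting overfitting -- fine only if the "
                "loss eventually starts to fall while accuracy keeps rising")
    return "diverging: loss grows -- check the learning rate first"
```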
nessuno Dec 01 '16 at 13:13