This is because loss and accuracy are two completely different things (at least logically)!
Consider an example where you define the loss as:
loss = (1 - accuracy)
In this case, when you try to minimize the loss, accuracy automatically increases.
Now consider another example in which you define loss as:
loss = average(prediction_probabilities)
Although this does not make much sense, it is technically still a valid loss function, and your weights are still updated to minimize it.
But, as you can see, in this case there is no relationship between loss and accuracy, so you cannot expect them to increase or decrease together.
Note: the loss will always be minimized (thus, your loss decreases after each iteration), regardless of whether accuracy improves along with it!
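To make the decoupling concrete, here is a minimal sketch (plain Python, hypothetical numbers) comparing binary cross-entropy loss and accuracy for two sets of predictions. Both sets classify every example correctly, so accuracy is identical, yet the loss differs because cross-entropy also measures confidence:

```python
import math

def binary_cross_entropy(y_true, y_prob):
    # Average negative log-likelihood over the batch.
    return -sum(
        y * math.log(p) + (1 - y) * math.log(1 - p)
        for y, p in zip(y_true, y_prob)
    ) / len(y_true)

def accuracy(y_true, y_prob, threshold=0.5):
    # Fraction of predictions on the correct side of the threshold.
    return sum(
        (p >= threshold) == (y == 1)
        for y, p in zip(y_true, y_prob)
    ) / len(y_true)

labels = [1, 1, 0, 0]

# Two hypothetical models: both get every example right,
# but one is far more confident than the other.
confident = [0.95, 0.90, 0.05, 0.10]
hesitant  = [0.60, 0.55, 0.45, 0.40]

print(accuracy(labels, confident), accuracy(labels, hesitant))  # both 1.0
print(binary_cross_entropy(labels, confident))   # small loss
print(binary_cross_entropy(labels, hesitant))    # larger loss, same accuracy
```

The optimizer would keep pushing the hesitant model's loss down even though accuracy has nothing left to gain, which is exactly why loss can fall while accuracy stays flat (and vice versa).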
PS: update your question with the loss function that you are trying to minimize.