Loss and accuracy are two different metrics for evaluating the performance of your model, and they are usually used at different stages.
Loss is used during the training process to find the “best” parameter values for your model (e.g., the weights in a neural network). It is what you try to minimize during training by updating the weights.
Accuracy is more of an applied metric. Once you have found the optimized parameters above, you use it to evaluate how well your model's predictions match the true data.
Let's use a toy classification example. You want to predict gender from weight and height. You have 3 data points, as follows (0 stands for male, 1 stands for female):
y1 = 0, x1_w = 50 kg, x1_h = 160 cm;
y2 = 0, x2_w = 60 kg, x2_h = 170 cm;
y3 = 1, x3_w = 55 kg, x3_h = 175 cm;
You are using a simple logistic regression model: y = 1 / (1 + exp(-(b1 * x_w + b2 * x_h)))
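As a rough Python sketch of this model (the helper name predict is just something I chose for this illustration):

```python
import math

def predict(b1, b2, x_w, x_h):
    # logistic (sigmoid) model: y_hat = 1 / (1 + exp(-(b1 * x_w + b2 * x_h)))
    return 1.0 / (1.0 + math.exp(-(b1 * x_w + b2 * x_h)))
```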
How do you find b1 and b2? You first define a loss, and then use an optimization method to minimize the loss iteratively by updating b1 and b2.
In our example, a typical loss for this binary classification task is the cross-entropy (log) loss:

loss = -sum_i [ y_i * log(y_hat_i) + (1 - y_i) * log(1 - y_hat_i) ]
We do not know what b1 and b2 should be, so let's start with a random guess, say b1 = 0.1 and b2 = -0.03. What is our loss now?
y_hat1 = 1 / (1 + exp(-(0.1 * 50 - 0.03 * 160))) = 1 / (1 + exp(-0.2)) ≈ 0.55
y_hat2 = 1 / (1 + exp(-(0.1 * 60 - 0.03 * 170))) = 1 / (1 + exp(-0.9)) ≈ 0.71
y_hat3 = 1 / (1 + exp(-(0.1 * 55 - 0.03 * 175))) = 1 / (1 + exp(-0.25)) ≈ 0.56

so the loss = -[ log(1 - y_hat1) + log(1 - y_hat2) + log(y_hat3) ] = -[ log(0.45) + log(0.29) + log(0.56) ] ≈ 2.6
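You can double-check this arithmetic with a few lines of Python, continuing the predict sketch above (the data tuples are just (y, weight, height) from this example):

```python
import math

# (y, weight, height) for the three data points above
data = [(0, 50, 160), (0, 60, 170), (1, 55, 175)]
b1, b2 = 0.1, -0.03  # the random guess

# cross-entropy: -(y * log(y_hat) + (1 - y) * log(1 - y_hat)), summed over the data
loss = 0.0
for y, x_w, x_h in data:
    y_hat = predict(b1, b2, x_w, x_h)
    loss -= y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat)

print(loss)  # roughly 2.6
```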
You would then use an optimization algorithm (e.g., gradient descent) to update b1 and b2 so as to reduce the loss.
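A minimal gradient descent sketch, continuing the snippet above just to illustrate the update mechanism (the learning rate and number of steps are arbitrary choices, not tuned values):

```python
lr = 1e-5  # small learning rate, picked arbitrarily for this sketch

for step in range(10000):
    # gradient of the cross-entropy loss with respect to b1 and b2
    g1 = g2 = 0.0
    for y, x_w, x_h in data:
        error = predict(b1, b2, x_w, x_h) - y
        g1 += error * x_w
        g2 += error * x_h
    # move the parameters against the gradient to reduce the loss
    b1 -= lr * g1
    b2 -= lr * g2
```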
Now, suppose b1 = 0.1 and b2 = -0.03 are the final b1 and b2 (the output of gradient descent); what is the accuracy now?
Suppose that when y_hat >= 0.5 we decide our prediction is female (1), otherwise 0. Then our algorithm predicts y1 = 1, y2 = 1 and y3 = 1. What is our accuracy? We predict y1 and y2 incorrectly and y3 correctly, so our accuracy is 1/3 ≈ 33.33%.
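In code, the thresholding and accuracy computation could look like this (again continuing the earlier sketch, and plugging in the b1 = 0.1, b2 = -0.03 from the question rather than the loop's output):

```python
correct = 0
for y, x_w, x_h in data:
    # predict female (1) when y_hat >= 0.5, otherwise male (0)
    y_pred = 1 if predict(0.1, -0.03, x_w, x_h) >= 0.5 else 0
    correct += int(y_pred == y)

print(correct / len(data))  # 1/3 ≈ 33.33% for this example
```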
PS: In Amir’s answer, back-propagation is described as an optimization method in NNs. I would rather view it as a way to compute the gradients of the loss with respect to the weights in an NN; common optimization methods in NNs are gradient descent and Adam.