Network training
You should use each instance of the training set once per epoch.
An epoch is one complete cycle through your dataset.
After you have run the data through the network and calculated the deltas, you should adjust the network weights. You can then perform a new forward pass through the network and run another training epoch by going through your training dataset again.
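For reference, the standard weight correction after backpropagation subtracts the learning rate times the gradient. Below is a minimal sketch for a single layer; the variable names and the list/matrix layout are illustrative assumptions, not a specific library's API.

```python
import numpy as np

# Gradient-descent correction for one layer, applied after the deltas
# have been computed by backpropagation (illustrative sketch).
# layer_inputs: activations feeding into the layer (length n_in)
# deltas:       backpropagated error terms of the layer's neurons (length n_out)
# weights:      n_in x n_out weight matrix of the layer
def update_layer_weights(weights, layer_inputs, deltas, learning_rate):
    gradient = np.outer(layer_inputs, deltas)   # dE/dW for this layer
    return weights - learning_rate * gradient   # step against the gradient
```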
Graphical representation
A really great graphical representation of backpropagation can be found here.
One-step training
There are two approaches to training a network to perform classification on a dataset. The simplest is called single-step or online learning; it is the method you will find in most of the literature, and it also converges faster. When training the network this way, you calculate the deltas for each layer and adjust the weights for every instance of your dataset.
So if you have a dataset of 60 instances, the weights will be adjusted 60 times by the end of the epoch.
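A minimal sketch of one epoch of online learning, using an assumed toy single-layer network with a sigmoid output and a squared-error delta (purely illustrative, not the answer's original code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Online (single-step) learning on a toy single-layer network:
# the weights are adjusted after every instance, so 60 instances
# give 60 weight updates per epoch.
def online_epoch(weights, dataset, learning_rate):
    for inputs, target in dataset:                       # each instance once per epoch
        output = sigmoid(inputs @ weights)                # forward pass
        delta = (output - target) * output * (1 - output) # output-layer delta
        weights -= learning_rate * delta * inputs         # update immediately
    return weights
```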
Batch training
Another approach is called batch learning or offline learning. This approach often yields a network with a lower residual error. When training the network this way, you calculate the deltas for each layer for every instance of the dataset, then average the individual deltas and correct the weights once per epoch.
If you have a dataset of 60 instances, the weights will be adjusted only once by the end of the epoch.
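The same toy single-layer setup as above, but trained in batch mode: the per-instance corrections are accumulated and averaged, and the weights are updated once at the end of the epoch (averaging the accumulated gradients here amounts to the same correction as averaging the deltas).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Batch (offline) learning on the same toy single-layer network:
# gradients are accumulated over the whole dataset and applied once per epoch.
def batch_epoch(weights, dataset, learning_rate):
    gradient_sum = np.zeros_like(weights)
    for inputs, target in dataset:
        output = sigmoid(inputs @ weights)                 # forward pass
        delta = (output - target) * output * (1 - output)  # output-layer delta
        gradient_sum += delta * inputs                     # accumulate, do not update yet
    weights -= learning_rate * gradient_sum / len(dataset) # single averaged update per epoch
    return weights
```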