Neural network input scaling

Do I need to scale input to a neural network? How does this affect the final decision of the neural network?

I tried to find some reliable sources. The Elements of Statistical Learning (p. 400) says that scaling the inputs helps you choose reasonable initial random weights to start with.

Are the final weights deterministic regardless of the initial random weights we use?

Thanks.

1 answer

Firstly, there are many types of ANNs; I assume you are talking about the simplest one, a multilayer perceptron trained with backpropagation.

Secondly, your question mixes up two separate things: data scaling (normalization) and weight initialization.

You need to initialize the weights randomly in order to break symmetry during training: if all the weights start out identical, they receive identical updates and stay identical forever. Beyond that, the specific values do not matter much, although values that are too large can slow convergence.
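To make the symmetry argument concrete, here is a minimal NumPy sketch (the layer sizes and the 0.01 scale are made-up illustrative values, not a recommendation from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes, chosen only for illustration.
n_in, n_hidden = 4, 8

# If every weight starts at the same value, every hidden unit computes
# the same output and receives the same gradient, so the units can
# never become different from one another.
w_symmetric = np.zeros((n_in, n_hidden))

# Small random values break that symmetry; keeping them small avoids
# saturating sigmoid/tanh units at the start of training.
w_random = rng.normal(loc=0.0, scale=0.01, size=(n_in, n_hidden))
```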

You are not required to normalize your data, but normalization can speed up the learning process. See this question for more details.
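As a rough sketch of what such normalization might look like in practice (the `standardize` helper below is hypothetical, not from any particular library), each feature is scaled to zero mean and unit variance using statistics computed on the training set only, so the test set sees the same transformation:

```python
import numpy as np

def standardize(X_train, X_test):
    """Scale each feature to zero mean and unit variance,
    using statistics computed from the training set only."""
    mean = X_train.mean(axis=0)
    std = X_train.std(axis=0)
    std[std == 0.0] = 1.0  # avoid division by zero for constant features
    return (X_train - mean) / std, (X_test - mean) / std

# Example usage with random data standing in for a real dataset.
rng = np.random.default_rng(0)
X_train = rng.normal(loc=5.0, scale=3.0, size=(100, 3))
X_test = rng.normal(loc=5.0, scale=3.0, size=(20, 3))
X_train_s, X_test_s = standardize(X_train, X_test)
```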

