Neural network: handling inaccessible inputs (missing or incomplete data)

Hopefully the last NN question you get from me this weekend, but here :)

Is there a way to handle an input that you don't always know, so that it doesn't affect the result in any way?

So ... if I ask someone whether they are a man or a woman and they would rather not answer, is there any way to ignore this input's contribution? Perhaps by placing it right in the center? (assuming a 0.0-to-1.0 input, set it to 0.5?)
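To make the question concrete, here is a minimal sketch of the encoding being asked about (the function name and category labels are hypothetical, just for illustration):

```python
def encode_gender(answer):
    # Map a binary categorical input onto [0.0, 1.0],
    # using the midpoint 0.5 for "prefer not to say".
    mapping = {"male": 0.0, "female": 1.0}
    return mapping.get(answer, 0.5)

encode_gender("male")   # 0.0
encode_gender(None)     # 0.5 -- the "unknown" midpoint
```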

thanks

+7
machine-learning neural-network
2 answers

You probably already know or suspect this, but there is no statistical basis for guessing or imputing missing values by averaging over the range of possible values, etc.

In particular, for NNs there are many techniques that can be used. The technique that I use, and have implemented myself, is one of the simplest, but it has a solid statistical basis and is still in use today. An academic article describing it is here.

The theory underlying this method is weighted integration over the incomplete data. In practice, no integrals are actually evaluated; they are approximated by closed-form solutions over Gaussian basis function networks. As you will see in the paper (it is a step-by-step explanation), it is simple to implement in your backprop algorithm.
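The closed-form Gaussian-basis-function solution from the paper is not reproduced here, but the underlying idea of integrating the network's output over the distribution of the missing input can be sketched with a plain Monte Carlo approximation. Everything below (function names, the `sampler` argument) is a hypothetical illustration, not the paper's method:

```python
def predict_with_missing(net, x, missing_idx, sampler, n_samples=100):
    # Approximate the integral of net(x) over the missing feature by
    # averaging predictions with values drawn from that feature's
    # (assumed known) marginal distribution.
    total = 0.0
    for _ in range(n_samples):
        x_filled = list(x)
        x_filled[missing_idx] = sampler()
        total += net(x_filled)
    return total / n_samples

# Toy usage: a "network" that just sums its inputs, with input 1 missing.
net = lambda x: sum(x)
y = predict_with_missing(net, [1.0, 0.0], missing_idx=1,
                         sampler=lambda: 0.5, n_samples=10)
```

The point is that the prediction is an expectation over plausible values of the missing input, rather than a single guessed fill-in; the paper replaces this sampling loop with a closed-form expression.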

+4

Neural networks are quite noise-resistant; this is one of their biggest advantages. You might want to try scaling your inputs to (-1.0, 1.0) and using 0 for a missing input. That way, the contribution of this input to the weighted sums is 0.0, which means no training will happen on its weights.
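A one-neuron sketch makes the "no training on a zero input" point concrete. With squared error on a linear neuron, the gradient for weight i is `error * x[i]`, so an input encoded as 0.0 leaves its weight untouched (names and learning rate here are illustrative):

```python
def sgd_step(weights, x, target, lr=0.1):
    # One SGD step on a single linear neuron with squared error.
    # The gradient for weight i is error * x[i], so a missing input
    # encoded as 0.0 produces a zero update for its weight.
    y = sum(w * xi for w, xi in zip(weights, x))
    error = y - target
    return [w - lr * error * xi for w, xi in zip(weights, x)]

w = [0.5, -0.3]
w_new = sgd_step(w, [1.0, 0.0], target=1.0)  # second input "missing"
# w_new[1] == w[1]: the missing input's weight is unchanged
```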

Probably the best book I've ever had the misfortune of not finishing (yet!) is Neural Networks and Learning Machines by Simon S. Haykin. In it, he talks about all kinds of issues, including how you should split your data into training/test samples for better training, etc. It's a really great book!

+2