Every Encog neural network example I have seen involves XOR or something similarly trivial. I have about 10,000 sentences, and each word in a sentence carries some kind of tag. The input layer must accept two inputs: the previous word and the current word. If there is no previous word, the first input is not activated at all. I need to walk through every sentence this way. Because each word depends on the previous one, I cannot simply build a single array the way the XOR example does. I also really don't want to load all the words from 10,000 sentences into one array; I would rather scan one sentence at a time and, as soon as I reach EOF, start again from the beginning.
How can I do it? I am not very comfortable with Encog, because all the examples that I saw were either XOR or extremely complex.
There are two inputs, and each input consists of 30 neurons. The probability that the word carries a specific tag is used as the input value, so most neurons receive 0 and the rest receive probabilities such as 0.5, 0.3, and 0.2. By "not activated" I mean that all 30 neurons of that input are set to 0. The output layer represents all possible tags, so it has 30 neurons; whichever output neuron ends up with the largest value determines the tag that is selected.
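To make that layout concrete, here is a rough sketch of how one word position could be turned into an Encog training pair under the scheme above. The class name WordEncoder, the prevTagProbs/currTagProbs arrays, and goldTagIndex are my own placeholders rather than anything from the Encog demos; only the BasicMLData/BasicMLDataPair calls are (as far as I know) standard Encog API:

    import org.encog.ml.data.MLDataPair;
    import org.encog.ml.data.basic.BasicMLData;
    import org.encog.ml.data.basic.BasicMLDataPair;

    public class WordEncoder {
        private static final int TAG_COUNT = 30;

        // prevTagProbs is null for the first word of a sentence.
        public static MLDataPair encode(double[] prevTagProbs,
                                        double[] currTagProbs,
                                        int goldTagIndex) {
            double[] input = new double[2 * TAG_COUNT];
            if (prevTagProbs != null) {
                // First 30 slots: tag probabilities of the previous word.
                System.arraycopy(prevTagProbs, 0, input, 0, TAG_COUNT);
            } // else the first 30 neurons stay 0.0 ("not activated")

            // Next 30 slots: tag probabilities of the current word.
            System.arraycopy(currTagProbs, 0, input, TAG_COUNT, TAG_COUNT);

            // Ideal output: one-hot vector with 1.0 at the correct tag.
            double[] ideal = new double[TAG_COUNT];
            ideal[goldTagIndex] = 1.0;

            return new BasicMLDataPair(new BasicMLData(input), new BasicMLData(ideal));
        }
    }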
From the Encog demo files I have seen, I cannot tell how to iterate over all 10,000 sentences, take each word in each sentence as input, and activate that input.
It seems the networks are trained from a single array containing all the training data, looping until the network converges. I would like to train the network from many different arrays (one array per sentence) and then loop over them again.
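Roughly what I am imagining is the sketch below: one MLDataSet per sentence, looped over repeatedly until the error is low enough. I don't know whether this is how Encog is meant to be used; the per-sentence data sets are assumed to have been built already, and creating a fresh ResilientPropagation for every sentence is only there to show the control flow I have in mind:

    import java.util.List;

    import org.encog.ml.data.MLDataSet;
    import org.encog.ml.train.MLTrain;
    import org.encog.neural.networks.BasicNetwork;
    import org.encog.neural.networks.training.propagation.resilient.ResilientPropagation;

    public class SentenceTrainer {
        // Loop over all sentences, one small data set per sentence,
        // until the worst per-sentence error drops below a threshold.
        public static void trainBySentence(BasicNetwork network, List<MLDataSet> sentenceSets) {
            int epoch = 1;
            double worstError;
            do {
                worstError = 0;
                for (MLDataSet sentence : sentenceSets) {
                    MLTrain train = new ResilientPropagation(network, sentence);
                    train.iteration(); // one pass over this sentence's word pairs
                    worstError = Math.max(worstError, train.getError());
                }
                System.out.println("Epoch #" + epoch + " worst sentence error: " + worstError);
                epoch++;
            } while (worstError > 0.01);
        }
    }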
This format obviously does not work for what I am doing:
    int epoch = 1;
    do {
        train.iteration();
        System.out.println("Epoch #" + epoch + " Error:" + train.getError());
        epoch++;
    } while (train.getError() > 0.01);
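For reference, the surrounding setup in those XOR demos looks roughly like the following (reconstructed from memory, so details may be slightly off). The point is that XOR_INPUT and XOR_IDEAL are one fixed pair of arrays covering the whole training set, which is exactly what does not map onto my per-sentence data:

    import org.encog.engine.network.activation.ActivationSigmoid;
    import org.encog.ml.data.MLDataSet;
    import org.encog.ml.data.basic.BasicMLDataSet;
    import org.encog.ml.train.MLTrain;
    import org.encog.neural.networks.BasicNetwork;
    import org.encog.neural.networks.layers.BasicLayer;
    import org.encog.neural.networks.training.propagation.resilient.ResilientPropagation;

    public class XorStyleDemo {
        // All training data lives in two fixed arrays.
        public static final double[][] XOR_INPUT = { {0, 0}, {1, 0}, {0, 1}, {1, 1} };
        public static final double[][] XOR_IDEAL = { {0}, {1}, {1}, {0} };

        public static void main(String[] args) {
            BasicNetwork network = new BasicNetwork();
            network.addLayer(new BasicLayer(null, true, 2));
            network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 3));
            network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1));
            network.getStructure().finalizeStructure();
            network.reset();

            MLDataSet trainingSet = new BasicMLDataSet(XOR_INPUT, XOR_IDEAL);
            MLTrain train = new ResilientPropagation(network, trainingSet);

            int epoch = 1;
            do {
                train.iteration();
                System.out.println("Epoch #" + epoch + " Error:" + train.getError());
                epoch++;
            } while (train.getError() > 0.01);
        }
    }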