Failed to rebuild the correct FFNN in pybrain

I trained a feed-forward neural network (FFNN) to match an unknown function using pybrain. I created the FFNN as follows:

net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)
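(The training step is omitted here; it looked roughly like the sketch below. The target function, sample range and epoch count are only placeholders, not my actual data.)

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)

# supervised dataset with 1 input and 1 target per sample
ds = SupervisedDataSet(1, 1)
for q in range(1, 11):          # placeholder sample range
    x = 1.0 / q
    ds.addSample((x,), (x,))    # placeholder target: identity on 1/Q

# plain backpropagation for a fixed number of epochs
trainer = BackpropTrainer(net, ds)
trainer.trainEpochs(100)        # placeholder epoch count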

I told pybrain to print the network parameters with the command

print net.params

and pybrain returned the parameters

(1.76464967 , 0.46764103 , 1.63394395 ,-0.95327762 , 1.19760151, -1.20449402, -1.34050959)
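(These seven numbers are the concatenated weights of all connections: with the default bias unit, a 1-2-1 network has 1x2 + 2x1 layer weights plus 2 + 1 bias weights. To see which connection each weight belongs to, something like the following should work, assuming pybrain's standard modules/connections attributes:)

# list every connection in the net and its slice of net.params
for mod in net.modules:
    for conn in net.connections[mod]:
        if conn.paramdim > 0:
            print conn
            print conn.params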

Now I want to use this trained network in another script. I tried:

def netp(Q):
    net = buildNetwork(1, 2, 1,hiddenclass=TanhLayer)
    net._setParameters=(1.76464967 , 0.46764103 , 1.63394395 ,-0.95327762 , 1.19760151, -1.20449402, -1.34050959)
    arg=1.0/float(Q)
    p=float(net.activate([arg]))
    return p

The problem is that the values returned from the network are completely wrong. For example:

 0.0749046652125 1.0
-2.01920546405 0.5
-1.54408069672 0.333333333333
 1.05895945271 0.25
-1.01314347373 0.2
 1.56555648799 0.166666666667
 0.0824497539453 0.142857142857
 0.531176423655 0.125
 0.504185707604 0.111111111111
 0.841424535805 0.1

where the first column is the network output and the second is the input. The output of the network should be close to the input value. What is the problem? Where am I going wrong? Is there a more suitable approach, or am I missing something?

1 answer

Typo:

net._setParameters=(1.76464967 , 0.46764103 , 1.63394395 ,-0.95327762 , 1.19760151, -1.20449402, -1.34050959)

_setParameters is a method, not an attribute; you are assigning the tuple to it instead of calling it. It should be:

net._setParameters([1.76464967 , 0.46764103 , 1.63394395 ,-0.95327762 , 1.19760151, -1.20449402, -1.34050959])

With the parameters set this way, the network behaves as expected. Evaluating it directly on the 1/Q input values:

>>> def netp(Q): return float(net.activate([Q]))
>>> for i in inp:
...   print '{}\t{:.5f}'.format(i, netp(i))

1.0      0.97634
0.5      0.46546
0.33333  0.29013
0.25     0.20762
0.2      0.16058
0.16666  0.13042
0.14285  0.10952
0.125    0.09421
0.11111  0.08254
0.1      0.07335
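For completeness, the corrected version of the netp helper from the question (keeping the 1/Q inversion and the parameter values above) would be something like:

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer

def netp(Q):
    # rebuild the same 1-2-1 architecture and load the trained weights
    net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)
    net._setParameters([1.76464967, 0.46764103, 1.63394395, -0.95327762,
                        1.19760151, -1.20449402, -1.34050959])
    arg = 1.0 / float(Q)
    return float(net.activate([arg]))

print netp(2)   # roughly 0.47, i.e. close to 1/2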