Unable to build the correct FFNN in PyBrain

I have trained a feed-forward neural network (FFNN) with PyBrain to fit an unknown function. I built the FFNN like this:

net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)
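
(For completeness, buildNetwork and TanhLayer come from PyBrain's standard modules:)

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer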

I asked PyBrain to print the net's parameters with the command

print net.params

and PyBrain returned the parameters:

(1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959)
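
Those seven values are the weights of a 1-2-1 network with a bias unit: two input-to-hidden weights, two bias-to-hidden weights, two hidden-to-output weights, and one bias-to-output weight (the exact ordering depends on PyBrain's internal connection sorting). A quick sketch to see which slice of net.params belongs to which connection:

for mod in net.modules:
    for conn in net.connections[mod]:
        print conn          # e.g. <FullConnection ...: 'in' -> 'hidden0'>
        print conn.params   # the parameter slice owned by this connection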

Now I want to use this fitted function in another script. I tried:

def netp(Q):
    net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)
    net._setParameters = (1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959)
    arg = 1.0 / float(Q)
    p = float(net.activate([arg]))
    return p

The problem is that the values the net returns are completely wrong. For example:

 0.0749046652125 1.0
-2.01920546405 0.5
-1.54408069672 0.333333333333
 1.05895945271 0.25
-1.01314347373 0.2
 1.56555648799 0.166666666667
 0.0824497539453 0.142857142857
 0.531176423655 0.125
 0.504185707604 0.111111111111
 0.841424535805 0.1

where the first column is the output of the net and the second is the input. The output should be close to the input value. What's the problem? Where am I going wrong? Is it overfitting, or am I missing something?

Accepted Answer

A typo:

net._setParameters = (1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959)

This line replaces the private _setParameters method with a tuple instead of calling it. Try replacing it with

net._setParameters([1.76464967, 0.46764103, 1.63394395, -0.95327762, 1.19760151, -1.20449402, -1.34050959])

and see if that helps.
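
Putting it together, a minimal sketch of the second script (assuming the same 1-2-1 architecture and the parameter values printed above):

from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer

params = [1.76464967, 0.46764103, 1.63394395, -0.95327762,
          1.19760151, -1.20449402, -1.34050959]

net = buildNetwork(1, 2, 1, hiddenclass=TanhLayer)
net._setParameters(params)   # copies the values into the net's parameter array

print net.activate([1.0])    # should now be close to 1.0, per the table below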

Second, I don't see a reason for the 1/Q operation, so the simpler

>>> def netp(Q): return float(net.activate([Q]))
>>> inp = [1.0/k for k in range(1, 11)]   # the ten inputs from the question
>>> for i in inp:
...   print '{}\t{:.5f}'.format(i, netp(i))

yields

1.0      0.97634
0.5      0.46546
0.33333  0.29013
0.25     0.20762
0.2      0.16058
0.16666  0.13042
0.14285  0.10952
0.125    0.09421
0.11111  0.08254
0.1      0.07335