Forward pass output of a pretrained network changes without backpropagation


I am using Chainer's pretrained VGG model (here named net). Every time I run the following code, I get a different result:

from PIL import Image
from chainer import Variable
from chainer.links.model.vision import vgg  # provides prepare() and VGG16Layers

# net is a pretrained VGG16Layers instance
img = Image.open("/Users/macintosh/Desktop/Code/Ger.jpg")
img = Variable(vgg.prepare(img))          # preprocess to the VGG input format
img = img.reshape((1,) + img.shape)       # add a batch dimension
print(net(img, layers=['prob'])['prob'])  # softmax class probabilities

I have checked vgg.prepare() several times and its output is always the same, and there is no random initialization here (net is a pretrained VGG network). So why is this happening?
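
This is the kind of sanity check I ran on the preprocessing (a minimal sketch, assuming vgg.prepare() returns a NumPy array, as it does in Chainer); it always prints True:

import numpy as np

a = vgg.prepare(Image.open("/Users/macintosh/Desktop/Code/Ger.jpg"))
b = vgg.prepare(Image.open("/Users/macintosh/Desktop/Code/Ger.jpg"))
print(np.array_equal(a, b))  # True: prepare() is deterministic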

1 Answer (accepted):

As you can see in the VGG implementation, it uses dropout. I think this is what causes the randomness.
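
You can see the effect of the train flag on dropout in isolation (a minimal sketch using chainer.functions.dropout; the input values are arbitrary):

import numpy as np
import chainer
import chainer.functions as F

x = np.ones((1, 4), dtype=np.float32)

# Default config (train=True): dropout randomly zeroes units, so
# repeated forward passes give different outputs.
print(F.dropout(x).data)
print(F.dropout(x).data)

# train=False: dropout is the identity, so the output is stable.
with chainer.using_config('train', False):
    print(F.dropout(x).data)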

When you want to run the forward computation in evaluation mode (instead of training mode), you can set the Chainer config 'train' to False as follows:

import chainer

# train=False disables dropout; no_backprop_mode skips building the graph
with chainer.no_backprop_mode(), chainer.using_config('train', False):
    result = net(img, layers=['prob'])['prob']

When the train flag is False, dropout is not executed (and some other functions change their behavior as well; for example, BatchNormalization uses its trained population statistics instead of per-batch statistics).
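
Here is a minimal sketch of that BatchNormalization behavior (the layer size and input are arbitrary):

import numpy as np
import chainer
import chainer.links as L

bn = L.BatchNormalization(3)
x = np.random.randn(5, 3).astype(np.float32)

# Train mode (default): normalizes with this batch's statistics
# and updates the running mean/variance.
y_train = bn(x)

# Eval mode: normalizes with the accumulated running statistics.
with chainer.using_config('train', False):
    y_test = bn(x)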