TensorFlow: How to change keep_prob for Dropout without using a feed_dict


I have built a TensorFlow model that works with training and test batches provided by input queues. Thus, I am not explicitly feeding data for training through the standard feed_dict. Nevertheless, I need to implement dropout, which requires a keep_prob placeholder so that dropout can be turned off during testing.
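
For context, the input pipeline looks roughly along these lines (a minimal sketch; the file name and feature parsing are illustrative):

import tensorflow as tf

# Queue-based input pipeline (TF 1.x): records are read and batched
# by background threads, so no feed_dict is involved.
filename_queue = tf.train.string_input_producer(['train.tfrecords'])
reader = tf.TFRecordReader()
_, serialized = reader.read(filename_queue)
example = tf.parse_single_example(
    serialized, features={'x': tf.FixedLenFeature([10], tf.float32)})
batch = tf.train.shuffle_batch([example['x']], batch_size=32,
                               capacity=1000, min_after_dequeue=100)
# `batch` feeds the model directly.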

I can't figure out how to solve this without building a separate model for testing. Are there any suggestions?

Thanks

There are 2 answers below.

Answer 1:

Let's say that your tensor layer1 is defined as follows:

layer1 = tf.nn.relu(tf.matmul(w, x) + b)

To apply dropout, you just do:

dropout_layer1 = tf.nn.dropout(layer1, keep_prob)

where your keep_prob is defined somewhere. I usually control it with FLAGS (a sketch of that follows the example below), but you can use a normal declaration from inside the program. You can then use dropout_layer1 as a normal tensor. Here is a trivial example of using it:

import tensorflow as tf
import numpy as np

tf.reset_default_graph()

keep_prob = 0.5

# Scalar slope and intercept, randomly initialized.
a = tf.get_variable('a', initializer=np.random.normal())
b = tf.get_variable('b', initializer=np.random.normal())

# Toy data: y is a noisy copy of x.
x = [0., 1., 2., 3., 4., 5., 6., 7., 8., 9.]
y = list(map(lambda v: v + np.random.normal(0, 0.1), x))

f = tf.multiply(x, a) + b
f_dropout = tf.nn.dropout(f, keep_prob)

loss = tf.reduce_sum(tf.pow(f_dropout - y, 2))
train = tf.train.GradientDescentOptimizer(0.001).minimize(loss)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

for step in range(1000):
    _, l, slope, intercept = sess.run([train, loss, a, b])
    # Current fitted line and parameters at this step.
    print(list(map(lambda v: v * slope + intercept, x)))
    print('loss: %.2f' % l)
    print('a: %.2f' % slope)
    print('b: %.2f' % intercept)

This is a bad example from a regression point of view, but it shows how to program dropout and makes it easy to see what dropout does. I hope you enjoy it :)
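
As for the FLAGS mention above, here is a minimal sketch of controlling keep_prob with TF 1.x's tf.app.flags (the flag name and layer shapes are illustrative):

import tensorflow as tf

# Illustrative flag; run e.g. `python train.py --keep_prob 0.8`.
tf.app.flags.DEFINE_float('keep_prob', 0.5, 'Dropout keep probability.')
FLAGS = tf.app.flags.FLAGS

x = tf.placeholder(tf.float32, [None, 10])
w = tf.get_variable('w', [10, 5])
b = tf.get_variable('b', [5], initializer=tf.zeros_initializer())

layer1 = tf.nn.relu(tf.matmul(x, w) + b)
# FLAGS.keep_prob is read once at graph-construction time,
# so the rate is fixed for the lifetime of this graph.
dropout_layer1 = tf.nn.dropout(layer1, FLAGS.keep_prob)

Note that because the flag is baked into the graph as a Python float, switching it between training and testing still requires rebuilding the graph; the asker's own answer below avoids this.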

Answer 2 (from the asker):

My question is obsolete; I made it too complicated.

One can still feed values to placeholders through feed_dict when calling sess.run(), even though an input queue is providing the training examples directly.
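
A minimal sketch of that (TF 1.x; the tf.placeholder_with_default pattern and the toy layer are my illustration): give keep_prob a default of 1.0, feed 0.5 during training, and feed nothing during testing:

import tensorflow as tf

# keep_prob defaults to 1.0 (no dropout), so test runs can omit it.
keep_prob = tf.placeholder_with_default(1.0, shape=(), name='keep_prob')

# Stand-in for a batch coming from the input queue.
features = tf.random_normal([32, 10])
hidden = tf.layers.dense(features, 5, activation=tf.nn.relu)
dropped = tf.nn.dropout(hidden, keep_prob)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Training step: the queue supplies data; feed_dict supplies only keep_prob.
    sess.run(dropped, feed_dict={keep_prob: 0.5})
    # Test step: omit keep_prob so the default of 1.0 applies.
    sess.run(dropped)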