I want to pass weights to `tensorflow.contrib.layers.conv2d`. The layers have the parameter `weights_initializer`. When passing the tensor via `weights_initializer=tf.constant_initializer(tensor)`, the tensor is additionally added as a node to the graph, causing the size of the model to increase.

Is there an alternative to this weight initialization? I know that `tf.nn.conv2d` accepts the weights as a parameter. The current model I am working with, however, uses the contrib layers.
If you want to initialize the weights to some constant but you don't want to store that constant in the graph, you can use a placeholder and feed a value for it on initialization. Just have something like the following (a minimal sketch; the input and filter shapes are made up for illustration):
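```python
import tensorflow as tf

# Placeholder for the initial weight values. It is only fed once, when
# the variables are initialized, so the values never become a constant
# node stored in the graph.
weight_init = tf.placeholder(tf.float32, shape=[5, 5, 3, 16])

inputs = tf.placeholder(tf.float32, shape=[None, 32, 32, 3])
net = tf.contrib.layers.conv2d(
    inputs,
    num_outputs=16,
    kernel_size=5,
    # weights_initializer must be callable, so wrap the placeholder in
    # a lambda that ignores the shape/dtype arguments it is given.
    weights_initializer=lambda *args, **kwargs: weight_init)
```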
Note that the shape of `weight_init` must match the shape of the weights tensor. Then, on initialization, feed the actual values for the placeholder when running the initializer (continuing the sketch above, with `weight_values` standing in for your real weights):
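```python
import numpy as np

# Stand-in for your actual pretrained weight values.
weight_values = np.random.rand(5, 5, 3, 16).astype(np.float32)

with tf.Session() as sess:
    # The values are fed only here, at initialization time; they are
    # not stored in the graph definition.
    sess.run(tf.global_variables_initializer(),
             feed_dict={weight_init: weight_values})
```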
Alternatively, you can use no initializer and, instead of calling an initialization op, use the `load` method of the weight variable. For this you would have to access that variable first:
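For example (a sketch assuming an open session `sess`; the variable name `Conv/weights` is the default for a single `conv2d` layer created without an explicit scope, so adjust it to your model):

```python
# Find the kernel variable created by the contrib layer. With the
# default scope, the first conv2d layer is named 'Conv' and its
# kernel variable 'weights'.
weights_var = [v for v in tf.global_variables()
               if v.name == 'Conv/weights:0'][0]

# load() pushes the value into the variable through a feed, without
# adding a constant or assign op to the graph.
weights_var.load(weight_values, session=sess)
```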