I'm rewriting tf.contrib.slim.nets.inception_v3 using tf.layers. Unfortunately the new tf.layers module does not work with arg_scope, as it does not have the necessary decorators. Is there a better mechanism in place that I should use to set default parameters for layers? Or should I simply add the proper arguments to each layer and remove the arg_scope?
Here is an example that uses arg_scope:
from tensorflow.contrib import layers
from tensorflow.contrib.framework.python.ops import arg_scope
from tensorflow.contrib.layers.python.layers import layers as layers_lib
from tensorflow.python.ops import variable_scope

with variable_scope.variable_scope(scope, 'InceptionV3', [inputs]):
  with arg_scope(
      [layers.conv2d, layers_lib.max_pool2d, layers_lib.avg_pool2d],
      stride=1,
      padding='VALID'):
There isn't another mechanism that lets you define default values in core TensorFlow, so you should specify the arguments for each layer.
For instance, slim-style code along these lines, where arg_scope supplies the shared arguments (a sketch: assume slim is tf.contrib.slim, x is an input tensor, and the layer sizes are arbitrary):
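with slim.arg_scope([slim.fully_connected],
                    activation_fn=tf.nn.relu,
                    weights_initializer=tf.truncated_normal_initializer(stddev=0.01)):
  x = slim.fully_connected(x, 800)
  x = slim.fully_connected(x, 1000)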
would become, with the formerly shared arguments repeated on every call:
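x = tf.layers.dense(x, 800, activation=tf.nn.relu,
                    kernel_initializer=tf.truncated_normal_initializer(stddev=0.01))
x = tf.layers.dense(x, 1000, activation=tf.nn.relu,
                    kernel_initializer=tf.truncated_normal_initializer(stddev=0.01))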
Alternatively, you can set a default initializer on an enclosing variable scope so that it doesn't have to be repeated on each call:
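with tf.variable_scope('fc',
                       initializer=tf.truncated_normal_initializer(stddev=0.01)):
  x = tf.layers.dense(x, 800, activation=tf.nn.relu)
  x = tf.layers.dense(x, 1000, activation=tf.nn.relu)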
Make sure to read the documentation of the layer to see which initializers default to the variable scope initializer. For example, the dense layer's kernel_initializer uses the variable scope initializer, while the bias_initializer uses tf.zeros_initializer().
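A quick sketch of that behavior (the scope name, shapes, and stddev here are arbitrary): the kernel falls back to the scope-level initializer because kernel_initializer defaults to None, while the bias keeps tf.zeros_initializer().

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 4])
with tf.variable_scope('example',
                       initializer=tf.truncated_normal_initializer(stddev=0.01)):
  # kernel: no explicit initializer, so it picks up the scope's initializer
  # bias: bias_initializer defaults to tf.zeros_initializer(), so the scope has no effect
  y = tf.layers.dense(x, 8)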