Replace stride layers in MobileNet application in Keras

I would like to apply MobileNetV2 in Keras to images of size 39 x 39 to classify 3 classes. My images represent heat maps (e.g. which keys have been pressed on a keyboard). I think MobileNet was designed to work on images of size 224 x 224. I will not use transfer learning but will train the model from scratch.

To make MobileNet work on my images, I would like to replace the first three stride-2 convolutions with stride-1 convolutions. I have the following code:

import tensorflow as tf
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import GlobalAveragePooling2D, Dropout, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

base_model = MobileNetV2(weights=None, include_top=False,
                         input_shape=[39, 39, 3])
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dropout(0.5)(x)
output_tensor = Dense(3, activation='softmax')(x)
cnn_model = Model(inputs=base_model.input, outputs=output_tensor)

opt = Adam(learning_rate=learning_rate)
cnn_model.compile(loss='categorical_crossentropy',
                  optimizer=opt, metrics=['accuracy', tf.keras.metrics.AUC()])

How can I replace the first three stride 2 convolutions with stride 1 without building MobileNet myself?

BEST ANSWER

Here is one workaround for your need, though a more general approach is probably possible. Note that in MobileNetV2 there is only one plain Conv2D layer with strides=(2, 2): the stem layer Conv1. The remaining downsampling happens inside the inverted residual blocks via stride-2 depthwise convolutions. If you follow the source code, the stem is built like this:

x = layers.Conv2D(
    first_block_filters,
    kernel_size=3,
    strides=(2, 2),
    padding='same',
    use_bias=False,
    name='Conv1')(img_input)
x = layers.BatchNormalization(
    axis=channel_axis, epsilon=1e-3, momentum=0.999, name='bn_Conv1')(x)
x = layers.ReLU(6., name='Conv1_relu')(x)

And the inverted residual blocks that follow are defined like this (only the first few shown):

x = _inverted_res_block(
    x, filters=16, alpha=alpha, stride=1, expansion=1, block_id=0)
x = _inverted_res_block(
    x, filters=24, alpha=alpha, stride=2, expansion=6, block_id=1)
x = _inverted_res_block(
    x, filters=24, alpha=alpha, stride=1, expansion=6, block_id=2)

So here I will only deal with the first conv with strides=(2, 2). The idea is simple: we add a new stride-1 layer in the right place in front of the built-in model and then remove the unwanted stride-2 layer.

def _make_divisible(v, divisor, min_value=None):
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    # Make sure that round down does not go down by more than 10%.
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v

alpha = 1.0
# For alpha = 1.0 this gives 32 filters, matching the original Conv1 layer.
first_block_filters = _make_divisible(32 * alpha, 8)

inputLayer = tf.keras.Input(shape=(39, 39, 3), name="inputLayer")

# New stride-1 stem convolution that will stand in for the original stride-2 Conv1.
inputConv = tf.keras.layers.Conv2D(
    first_block_filters,
    kernel_size=3,
    strides=(1, 1),
    padding='same',
    use_bias=False,
    name='Conv1_')(inputLayer)

The _make_divisible function above is simply taken from the source code so that the new layer gets the same number of filters as the original stem. Now we pass this layer's output to MobileNetV2 as its input tensor, so it sits right before the first conv layer:

base_model = tf.keras.applications.MobileNetV2(weights=None,
                                               include_top=False,
                                               input_tensor=inputConv)
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dropout(0.5)(x)
output_tensor = Dense(3, activation='softmax')(x)
cnn_model = Model(inputs=base_model.input, outputs=output_tensor)

Now, if we inspect the first few layers:

for i, l in enumerate(cnn_model.layers):
    print(l.name, l.output_shape)
    if i == 8: break

inputLayer [(None, 39, 39, 3)]
Conv1_ (None, 39, 39, 32)
Conv1 (None, 20, 20, 32)
bn_Conv1 (None, 20, 20, 32)
Conv1_relu (None, 20, 20, 32)
expanded_conv_depthwise (None, 20, 20, 32)
expanded_conv_depthwise_BN (None, 20, 20, 32)
expanded_conv_depthwise_relu (None, 20, 20, 32)
expanded_conv_project (None, 20, 20, 16)

The layers named Conv1_ and Conv1 are the new layer (with strides=1) and the old layer (with strides=2), respectively. As we wanted, we now remove the Conv1 layer with strides=2 as follows:

cnn_model._layers.pop(2) # remove Conv1

for i, l in enumerate(cnn_model.layers):
    print(l.name, l.output_shape)
    if i == 8: break

inputLayer [(None, 39, 39, 3)]
Conv1_ (None, 39, 39, 32)
bn_Conv1 (None, 20, 20, 32)
Conv1_relu (None, 20, 20, 32)
expanded_conv_depthwise (None, 20, 20, 32)
expanded_conv_depthwise_BN (None, 20, 20, 32)
expanded_conv_depthwise_relu (None, 20, 20, 32)
expanded_conv_project (None, 20, 20, 16)
expanded_conv_project_BN (None, 20, 20, 16)

Now you have a cnn_model whose first conv layer uses strides=1. However, in case you're wondering about this approach and its possible issues, please see my other answer related to this one: Remove first N layers from a Keras Model?
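
As a quick sanity check (a minimal sketch of my own, assuming a TF version where the ._layers trick above works), you can push a dummy batch through the modified model; this only confirms that it builds, outputs 3 class scores, and that the new stem keeps the full 39 x 39 resolution. Whether the popped Conv1 is truly detached from the call graph is the point discussed in the linked answer:

import numpy as np

dummy = np.zeros((1, 39, 39, 3), dtype="float32")

# The full model should still build and return 3 class probabilities.
print(cnn_model.predict(dummy).shape)   # expected: (1, 3)

# The new stride-1 stem keeps the input resolution instead of halving it.
stem = Model(cnn_model.input, cnn_model.get_layer("Conv1_").output)
print(stem.predict(dummy).shape)        # expected: (1, 39, 39, 32)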