I am training a custom model that includes BatchNormalization layers in TensorFlow 2.12. During training, I save checkpoints like this:
    checkpoint = tf.train.Checkpoint(step=tf.Variable(0), optimizer=optimizer, model=model)
    manager = tf.train.CheckpointManager(checkpoint, os.path.join(output_folder, "ckpts"), max_to_keep=3)
    manager.save(checkpoint_number=int(checkpoint.step) + 1)
After the model is trained, I load the latest checkpoint like this:
    checkpoint.restore(manager.latest_checkpoint).expect_partial()
The trainable weights are loaded into the model successfully. However, the non-trainable variables moving_variance and moving_mean of the BatchNormalization layers are reset to their default values, 1 and 0 respectively.
Why are the checkpoints not saving the values of moving_variance and moving_mean calculated during training?
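For reference, here is a minimal self-contained sketch of the save/restore flow described above. The layer sizes, the dummy training step, and the temporary checkpoint directory are placeholders of my own, not my real training code; the point is only to show how the BatchNormalization moving statistics are tracked by `tf.train.Checkpoint` through the `model` attribute:

```python
import tempfile

import numpy as np
import tensorflow as tf


def build_model():
    # BatchNormalization keeps moving_mean / moving_variance as
    # non-trainable variables, updated only when called with training=True.
    return tf.keras.Sequential([
        tf.keras.layers.Dense(4, input_shape=(4,)),
        tf.keras.layers.BatchNormalization(),
    ])


model = build_model()
optimizer = tf.keras.optimizers.Adam()

# One training step so the moving statistics move away from 0 / 1.
x = np.random.randn(32, 4).astype("float32")
with tf.GradientTape() as tape:
    y = model(x, training=True)  # training=True updates the moving stats
    loss = tf.reduce_mean(tf.square(y))
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))

ckpt_dir = tempfile.mkdtemp()  # placeholder for output_folder/"ckpts"
checkpoint = tf.train.Checkpoint(step=tf.Variable(0), optimizer=optimizer, model=model)
manager = tf.train.CheckpointManager(checkpoint, ckpt_dir, max_to_keep=3)
manager.save(checkpoint_number=int(checkpoint.step) + 1)

saved_mean = model.layers[1].moving_mean.numpy().copy()

# Restore into a freshly built model. The variables must already exist
# (here they are created eagerly via input_shape) for the restored
# values to be assigned immediately rather than deferred.
new_model = build_model()
new_checkpoint = tf.train.Checkpoint(step=tf.Variable(0),
                                     optimizer=tf.keras.optimizers.Adam(),
                                     model=new_model)
new_checkpoint.restore(manager.latest_checkpoint).expect_partial()
restored_mean = new_model.layers[1].moving_mean.numpy()

print("restored moving_mean matches:", np.allclose(saved_mean, restored_mean))
```

In this reduced form the moving statistics are part of the checkpoint, since `tf.train.Checkpoint(model=model)` tracks all of the model's variables, trainable and non-trainable alike.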