I am using the SGD optimizer and want to set the momentum after initialization, similar to learning rate scheduling, by calling tf.keras.backend.set_value(optimizer.momentum, momentumValue) (docs: https://www.tensorflow.org/api_docs/python/tf/keras/backend/set_value).
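Roughly, my setup looks like this (toy model and placeholder values, not my real code; mixed precision is enabled, which is where the LossScaleOptimizer in the error below comes from):

```python
import tensorflow as tf

# Mixed precision is enabled, so compile() wraps the SGD instance
# in a LossScaleOptimizer behind the scenes.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9)
model.compile(optimizer=optimizer, loss="mse")

# Try to change the momentum the same way a learning rate schedule
# would change the learning rate:
tf.keras.backend.set_value(model.optimizer.momentum, 0.85)
```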
However, all I get is the following error:
AttributeError: 'LossScaleOptimizer' object has no attribute 'momentum'
Is there any way to set the momentum? This is important for implementing the 1Cycle Policy, where the momentum should cycle as well, but I can't believe there is no way to adjust the momentum in Keras after initialization.
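For context, this is roughly what I would like to end up with, mirroring how a learning rate scheduler callback works (OneCycleMomentum and momentum_schedule are just illustrative names, not anything from my actual code):

```python
import tensorflow as tf

class OneCycleMomentum(tf.keras.callbacks.Callback):
    """Illustrative callback: cycle the momentum alongside the learning rate.

    `momentum_schedule` stands in for whatever function maps the current
    epoch to the momentum value the 1Cycle Policy prescribes.
    """

    def __init__(self, momentum_schedule):
        super().__init__()
        self.momentum_schedule = momentum_schedule

    def on_epoch_begin(self, epoch, logs=None):
        new_momentum = self.momentum_schedule(epoch)
        # This is the call that currently fails, because self.model.optimizer
        # is a LossScaleOptimizer without a `momentum` attribute.
        tf.keras.backend.set_value(self.model.optimizer.momentum, new_momentum)
```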