I wonder, is there an easy way to schedule the regularization factor during training?
For example, changing the learning rate can easily be done using tf.keras.optimizers.schedules:
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.001, decay_steps=10000, decay_rate=0.9)
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)
Is there an easy way to do the same with the regularization factor? Like this:
r_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1, decay_steps=10000, decay_rate=0.9)
regularizer = tf.keras.regularizers.L2(l2=r_schedule)
If not, how can I gradually change the regularization factor with minimal effort?
IIUC, you should be able to use a custom callback and implement the same/similar logic used by tf.keras.optimizers.schedules.ExponentialDecay (though it could go beyond minimal effort):
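Here is a minimal sketch of that idea. The built-in L2 regularizer won't accept a schedule, so this keeps the factor in a tf.Variable, reads it from a custom regularizer, and updates it from a callback. The names ScheduledL2 and L2DecayCallback and all the decay values are my own illustrative choices, not TensorFlow APIs:

import tensorflow as tf

# Decay settings mirroring ExponentialDecay's formula:
# factor(step) = initial * decay_rate ** (step / decay_steps)
initial_l2 = 0.1
decay_rate = 0.9

# Keep the factor in a non-trainable variable so it can be updated in place.
l2_factor = tf.Variable(initial_l2, trainable=False, dtype=tf.float32)

class ScheduledL2(tf.keras.regularizers.Regularizer):
    """L2 penalty whose factor is read from a variable at every call."""

    def __init__(self, factor):
        self.factor = factor

    def __call__(self, weights):
        return self.factor * tf.reduce_sum(tf.square(weights))

    def get_config(self):
        return {"factor": float(self.factor.numpy())}

class L2DecayCallback(tf.keras.callbacks.Callback):
    """Applies exponential decay to the factor once per epoch."""

    def __init__(self, factor, initial_value, decay_rate):
        super().__init__()
        self.factor = factor
        self.initial_value = initial_value
        self.decay_rate = decay_rate

    def on_epoch_begin(self, epoch, logs=None):
        # Same formula as ExponentialDecay, with decay_steps = 1 epoch.
        self.factor.assign(self.initial_value * self.decay_rate ** epoch)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(64, activation="relu",
                          kernel_regularizer=ScheduledL2(l2_factor)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# model.fit(x, y, epochs=10,
#           callbacks=[L2DecayCallback(l2_factor, initial_l2, decay_rate)])

If you want per-step rather than per-epoch updates, the same assign can be done in on_train_batch_begin instead.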