The Nadam optimizer does not support tf.keras.optimizers.LearningRateSchedules as the learning rate


This is my first question here. I'm experimenting with tensorflow.keras, building some CNNs, and I would like to know why this conflict arises. Thanks.

from tensorflow.keras.optimizers import Nadam
from tensorflow.keras.optimizers.schedules import ExponentialDecay

initial_learning_rate = 0.1
lr_schedule = ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000, decay_rate=0.96, staircase=True)

# 'model' is a CNN built earlier with tf.keras; compiling it with
# Nadam plus a schedule raises the ValueError from the title.
model.compile(optimizer=Nadam(learning_rate=lr_schedule),
              loss='categorical_crossentropy', metrics=['accuracy'])

1 Answer

The error ValueError: The Nadam optimizer does not support tf.keras.optimizers.LearningRateSchedules as the learning rate is raised because, in this version of TensorFlow, the Nadam optimizer does not accept a LearningRateSchedule object the way the other built-in optimizers do.

You can use any optimizer other than Nadam; all of the following support schedules (see the sketch after this list):

  • Adadelta
  • Adagrad
  • Adam
  • Adamax
  • Ftrl
  • RMSprop
  • SGD
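
For example, swapping Nadam for Adam is a one-line change. This is a minimal sketch that reuses the lr_schedule and model from the question:

from tensorflow.keras.optimizers import Adam

# Adam accepts a LearningRateSchedule object directly, so the
# ExponentialDecay instance from the question can be passed as-is.
model.compile(optimizer=Adam(learning_rate=lr_schedule),
              loss='categorical_crossentropy', metrics=['accuracy'])

If you want to keep Nadam, one workaround is to drive the learning rate yourself with a tf.keras.callbacks.LearningRateScheduler callback, which takes a plain Python function of the epoch instead of a LearningRateSchedule object. A sketch, where STEPS_PER_EPOCH, x_train, and y_train are placeholders for your own setup:

from tensorflow.keras.callbacks import LearningRateScheduler
from tensorflow.keras.optimizers import Nadam

STEPS_PER_EPOCH = 1000  # placeholder: set to len(train_set) // batch_size

def staircase_decay(epoch, lr):
    # Per-epoch approximation of ExponentialDecay(0.1, 100000, 0.96, staircase=True).
    step = epoch * STEPS_PER_EPOCH
    return 0.1 * 0.96 ** (step // 100000)

model.compile(optimizer=Nadam(learning_rate=0.1),
              loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=10,
          callbacks=[LearningRateScheduler(staircase_decay)])

(Recent TensorFlow releases reworked the optimizer API and, as far as I know, Nadam now accepts schedules directly, so upgrading TensorFlow may also solve this.)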